There is no simple measure, such as a general catalytic performance, that could be monitored along a parameter such as the number of metals in a multi-metal catalyst; the variety of catalytic compounds, desired reactions, and optimization targets (for example: lowest possible reaction temperature) is too large. For this reason alone it makes sense to focus on emergent properties such as porosity [3] and fractal dimensions [4,5] and to develop more general optimization strategies based on general descriptions that avoid taking too many properties into account, especially since those properties are highly interdependent anyway.
The fractal dimension of a set is, strictly speaking [6], the capacity dimension D, defined so that the minimum number N of open sets of diameter ε needed to cover the set scales as N ∝ ε^(−D). More loosely, ‘fractal dimension’ can refer to any of the dimensions commonly used to characterize fractals (correlation dimension, information dimension, Lyapunov dimension, Minkowski-Bouligand dimension) [7].
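The capacity dimension can be estimated numerically by box counting: count the occupied boxes N(ε) at several box sizes ε and fit the slope of log N against log(1/ε). A minimal sketch (the grid construction and the test point set are my own illustration, not from the cited references):

```python
import numpy as np

def box_counting_dimension(points, epsilons):
    """Estimate the capacity (box-counting) dimension of a 2D point set.

    For each box size eps, count the number N(eps) of occupied grid boxes;
    the dimension D is the slope of log N versus log(1/eps).
    """
    counts = []
    for eps in epsilons:
        # Assign each point to a grid box of side eps; count unique boxes.
        boxes = set(map(tuple, np.floor(points / eps).astype(int)))
        counts.append(len(boxes))
    # Fit log N = D * log(1/eps) + c; the slope is the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(epsilons)), np.log(counts), 1)
    return slope

# Sanity check: points filling a line segment should give D close to 1.
pts = np.column_stack([np.linspace(0, 1, 10000), np.zeros(10000)])
print(box_counting_dimension(pts, [0.1, 0.05, 0.02, 0.01, 0.005]))
```

For a genuinely fractal set (e.g. an aggregate's mass distribution), the same fit over a suitable range of ε yields a non-integer slope.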
There can be no single well-defined measure that captures and compares the total complexity of very different hierarchical nano-micro compounds, but something that represents complexity should always be recorded. In the case of nano-alloying, for example, the number and width of XRD peaks is a better measure than mere metal-metal ratios. One should not be afraid of losing touch with more ‘physical’ parameters: all physical parameters are emergent; think only of the importance of temperature T, which is meaningless for atoms considered one by one. As long as one keeps track of several parameters computationally, one can always later transform the axes of the parameter space until a more suitable combination suggests itself.
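Transforming the recorded parameter axes until a better combination suggests itself can be sketched with a principal component analysis; the parameter columns below are hypothetical placeholders, not measurements from the original work:

```python
import numpy as np

# Hypothetical recorded parameters per sample, e.g. XRD peak count,
# mean peak width, metal-metal ratio (columns are placeholders).
rng = np.random.default_rng(0)
data = rng.normal(size=(50, 3))
data[:, 2] = 2.0 * data[:, 0] + 0.1 * rng.normal(size=50)  # correlated axes

# Center the data and diagonalize its covariance: the eigenvectors define
# new, decorrelated axes; large eigenvalues mark the informative ones.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # ascending eigenvalues
transformed = centered @ eigenvectors  # samples expressed in the new axes

print(eigenvalues)  # small eigenvalues flag nearly redundant directions
```

In the transformed coordinates the parameters are decorrelated, so a redundant axis (here the artificially correlated third column) shows up as a direction with near-zero variance that can be dropped.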
A systematic exploration of a parameter such as catalytic strength as a function of the atomic percentages of the metals takes many days, and the results suffer from day-to-day variations in the laboratory. Multi-sample methods are crucial time savers; laboratory experience shows that some investigations are impossible without them, and this becomes the more true the more complex the structures are. With walk-in optimization, every added parameter adds two more steps to be taken at every already investigated point in parameter space; with surveys, the number of samples multiplies with every added parameter. Higher complexity therefore requires higher throughput, for example screening methods that prepare and analyze many different samples simultaneously.
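This scaling can be made concrete with a toy count (the ten values per parameter are an arbitrary assumption for illustration):

```python
# Survey: sampling each parameter at k values needs k**p samples,
# so every added parameter multiplies the sample count by k.
def survey_samples(parameters, values_per_parameter):
    return values_per_parameter ** parameters

# Walk-in optimization: each step probes one increment up and one down
# along every parameter, so every added parameter adds two neighbors
# to test at every already investigated point.
def neighbors_per_point(parameters):
    return 2 * parameters

for p in range(1, 6):
    print(p, survey_samples(p, 10), neighbors_per_point(p))
```

At ten values per parameter, a five-parameter survey already requires 100,000 samples, which is why parallel, multi-sample preparation becomes unavoidable.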
Unsurprisingly, the biologists are ahead: their multi-well trays usually hold tens to hundreds of samples, which is unusual in nano-materials research, although individual nano-structures are much smaller than living cells. Nanoscience reaction vessels, for example those for hydrothermal synthesis, are, relatively speaking, enormous.
High-throughput screening of many slightly different, simultaneously synthesized samples, especially screening of their optical and catalytic properties, should be relatively easy to introduce by adapting equipment and methods from biochemical research. It is, for example, feasible to lower concentrations so that catalytic tests, monitored via gas evolution or color bleaching, can be started effectively all at once in hundreds of samples inside a sample tray, while images of the tray taken over time are analyzed with image-processing software. This again leads naturally to statistical and computationally enhanced methods that can deal with the amounts of data generated by high-throughput characterization and optimization. Before we focus on computational aspects, however, we underline the importance of statistics once more from a different angle, because proper statistics is perhaps what is most lacking in much of nanotechnology, although it is easily supplied by computational methods.
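Extracting per-well bleaching curves from tray images could be sketched as follows; the tray geometry, well coordinates, and decay rates are hypothetical assumptions, and the frames are synthetic arrays standing in for camera images:

```python
import numpy as np

def well_means(frame, centers, radius):
    """Average intensity inside a circular region around each well center."""
    yy, xx = np.mgrid[0:frame.shape[0], 0:frame.shape[1]]
    means = []
    for cy, cx in centers:
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        means.append(frame[mask].mean())
    return means

# Synthetic time series: two wells whose dye signal bleaches at
# different rates (squares stand in for imaged wells).
centers = [(20, 20), (20, 60)]
frames = []
for t in range(5):
    frame = np.zeros((40, 80))
    frame[10:30, 10:30] = 1.0 - 0.2 * t   # fast-bleaching well
    frame[10:30, 50:70] = 1.0 - 0.05 * t  # slow-bleaching well
    frames.append(frame)

traces = np.array([well_means(f, centers, radius=8) for f in frames])
print(traces[:, 0])  # dye signal over time for the first well
```

Fitting each trace then yields one rate constant per well, so a single image stack delivers hundreds of catalytic measurements at once.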
--------------------
This article continues the series “Adapting As Nano Approaches Biological Complexity: Witnessing Human-AI Integration Critically”, where the status of all this as suppressed information was discussed. I allow myself to publish the most interesting and critical parts here (edited). If citing, please nevertheless cite [1] in order to support the author.
The previous posts were “Complexity And Optimization: Lost In Design Space”
and “Magic Of Complexity With Catalysts Social Or Metallic”.
Next week's post will continue with
“Critique of Nanotech: The most dangerous Science least carefully done”,
and perhaps I will also add
“A flexible, evolving approach to computing.”
------------------------
[1] S. Vongehr et al., Adapting Nanotech Research as Nano-Micro Hybrids Approach Biological Complexity. J. Mater. Sci. Technol., 2016, 32(5), 387–401. doi:10.1016/j.jmst.2016.01.003
[2] H. L. Jiang and Q. Xu, Recent progress in synergistic catalysis over heterometallic nanoparticles. J. Mater. Chem., 2011, 21, 13705.
[3] P. M. Adler and J. F. Thovert, Fractal porous media. Transp. Porous Med., 1993, 13, 41–78.
[4] P. Grassberger, Generalized Dimensions of Strange Attractors. Phys. Lett. A, 1983, 97, 227.
[5] G. C. Bushell, Y. D. Yan, D. Woodfield, J. Raper and R. Amal, On techniques for the measurement of the mass fractal dimension of aggregates. Adv. Colloid Interface Sci., 2002, 95, 1–50.
[6] S. N. Rasband, Fractal Dimension. Ch. 4 in Chaotic Dynamics of Nonlinear Systems, Wiley, New York, 1990, 71–83.
[7] H. G. E. Hentschel and I. Procaccia, The Infinite Number of Generalized Dimensions of Fractals and Strange Attractors. Physica D, 1983, 8, 435.