For a while now I’ve been dying to share some of the new nanostructures I’ve discovered during ion-assisted vapor deposition of CI(G)S thin films. My paper was published in the Journal of Applied Physics last fall (sorry for the blog outage over the winter break). The paper is titled “Nanostructured light-absorbing crystalline CuIn(1–x)GaxSe2 thin films grown through high flux, low energy ion irradiation”: http://dx.doi.org/10.1063/1.4823987. There are a lot of unanswered questions about the work that leave me dreaming up different explanations for how these films are growing. Read More
I have no idea why this took so long, but I finally have the code to import and output various graphs of the reciprocal space maps (RLM or Q-space) taken using the Xpert X-ray diffraction system. One of the difficulties in outputting the older data has been solved by our new line-scan detector system: the data is now taken in the simpler Omega-2Theta space instead of Omega-Omega2Theta space. With the help of Mauro Sardela, the fantastic research scientist who runs the XRD lab at the Materials Research Laboratory (FS-MRL), I’ve properly translated the data into Q-space using the same equations the Xpert Epitaxy software uses.
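For anyone wanting to try the angle-to-Q translation outside MATLAB, here is a minimal Python sketch of the commonly published RSM relations. The function name, the Cu Kα1 wavelength default, and the 1/Å units are my choices, not the Epitaxy software’s; Epitaxy’s own rlu output is a further constant scaling on top of this, so treat this as an illustration rather than a translation of its code:

```python
import numpy as np

def angles_to_q(omega_deg, twotheta_deg, wavelength=1.5406):
    """Convert omega / 2theta angles (degrees) to reciprocal-space
    coordinates Qx, Qz using the standard RSM relations.  The default
    wavelength is Cu K-alpha1 in Angstroms, so Q comes out in 1/Angstrom."""
    omega = np.radians(omega_deg)
    twotheta = np.radians(twotheta_deg)
    k = 2.0 * np.pi / wavelength  # magnitude of the incident wavevector
    # Incident beam hits at omega; diffracted beam leaves at (2theta - omega).
    qx = k * (np.cos(omega) - np.cos(twotheta - omega))
    qz = k * (np.sin(omega) + np.sin(twotheta - omega))
    return qx, qz

# Symmetric reflection: omega = 2theta/2, so all of Q lies along Qz.
qx, qz = angles_to_q(17.0, 34.0)
```

A quick sanity check like the symmetric case above (Qx should vanish) is a good way to confirm any such translation before trusting it on real maps.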
Read More »
For years, while Apple had proprietary system software, I was itching to have a Unix system underneath along with the ease of the windowing system. Well, after OS X was released, I was ecstatic. Why? Because of the ease of some tasks in Unix in comparison to other OSes. This post is only one example of what you can do if you do a bit of research into how to use your Mac. For those who have un*x boxes, this will merely be a place-holder for a few AWK scripts for you.
Read More »
After some new scans, it appears the XRDMLread.m function I talked about is doing a pretty good job of getting the 2-axis scans into MATLAB. I was able to alter the code to accept the standard Epitaxy software’s translation to Q-space. (In Epitaxy the default is R = 0.5, I believe.) So, the following image was imported with XRDMLread.m, then plotted with the standard 2-D example from the author’s website. The code was altered to output 10000×RLU units, the same as Epitaxy (Panalytical). I haven’t checked all the numbers, but it’s looking OK so far. Unfortunately, it appears at first glance that the color scale loses its meaning as far as intensity is concerned.
Working a bit with my old q-space map code, I’m able to accomplish the following:
Note the strange love-handles the data gains. I suspect this might be due to the Gridfit function (see the MATLAB file repository) I used for regridding the data. Gridfit.m uses an extrapolant method, and my suspicion is that it is trying to fill out the square data matrix, producing plausible-looking values for points near the data but outside the scanned range. I’ll try it again with regrid or something similar in the future when I have time.
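As a sketch of one way around the extrapolation issue: SciPy’s `griddata` interpolates only inside the convex hull of the measured points and returns NaN outside it, so nothing gets invented beyond the scan range. (The data here is a toy Gaussian, not a real RSM, and `griddata` is a stand-in for, not a port of, Gridfit.)

```python
import numpy as np
from scipy.interpolate import griddata

# Scattered (x, y, intensity) samples; a toy stand-in for the RSM points.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(200, 2))
vals = np.exp(-((pts[:, 0] - 0.5) ** 2 + (pts[:, 1] - 0.5) ** 2) / 0.02)

# Regular grid that deliberately extends past the sampled region.
gx, gz = np.meshgrid(np.linspace(-0.2, 1.2, 50), np.linspace(-0.2, 1.2, 50))

# Linear interpolation inside the convex hull of the samples; NaN outside,
# so no "love-handles" appear beyond the measured range.
zi = griddata(pts, vals, (gx, gz), method="linear")
outside = np.isnan(zi)  # mask of grid points beyond the measured region
```

Plotting `zi` with the NaN mask left in place keeps the map honest about where data actually exists.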
If anyone knows where the current site for XRDMLread.m is, I’d love to link to it; it appears the site may be down (a graduated student, I suspect). Normally you could obtain the wonderful XRDMLread.m function and examples on the XRDMLread.m website, but for now I have to wait until I hear from the authors before I can share the file. I also don’t yet trust my icky 3-D code, so I prefer not to release that until I have things hashed out. Sorry!
I’m extremely happy that Panalytical has published their XRDML file format and that the makers of XRDMLread.m have released their .m files for MATLAB. In the past, when Philips had the Xpert systems, the data was stuck for the most part in proprietary formats. [You could slice the data and output it in ASCII, but making that work was a pain, which is why I never released that previous code.] I’m much closer to the 3-D plotting now, and hope to finish it up before the thesis (my primary work, which is not this plotting) is published.
Wishing you luck in your research!
Every once in a while a simple solution gracefully solves a problem that affects a large number of people. It doesn’t happen often, but when it does, it’s wonderful to see the results. This is the type of thing most scientists hope to experience at least once in their careers. I think most of us at one point or another have hoped: “please let me improve the world in some small way to make life easier for some people.”
I have to admit, as a scientist I love to see graceful solutions to any problem. Prof. Josh Silver at the University of Oxford has come up with just such a brilliantly simple solution, one that any scientist who understands index of refraction will greet with an “Ahhh… yes!” Prof. Silver has made plastic glasses with adaptive lenses for third world countries. To get glasses in a third world country right now, you have to either already know (by magic) your prescription, or somehow happen across an optometrist (and you thought it was just about affording a roof over your head!). Unfortunately, optometrists don’t grow on trees in third world countries, so you’re pretty much out of luck.
Enter Prof. Josh Silver’s “dial a prescription” solution. With his glasses you simply turn a few dials, which push plungers in or out of a syringe attached to each lens. These syringes hold a fluid with the same index of refraction as a polymer film that flexes under pressure (positive and negative). That pressure bows the surface of the “lens” (which is fixed at the edges to a certain thickness) out or in. How do you know your prescription? Simple: is it clearer or not? [Much like the old A or B, A or B, 1 or 2, 1 or 2 hassle we all go through at the optometrist's office, but with all that binary testing cut out... just dial it in or out and bam, there's your prescription. In fact, it resolves fractions of diopters, so it's much more analog than the current system.] Pure brilliance. Such a simple solution to a complex problem.
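To put a rough number on the idea, here’s a back-of-the-envelope sketch using the thin-lens, plano-convex approximation. The fluid index below is my guess at something oil-like, not a figure from Prof. Silver’s actual design:

```python
def lens_power_diopters(radius_m, n_fluid=1.47):
    """Approximate power (diopters) of a thin plano-convex fluid lens
    whose flexible surface bows to radius of curvature `radius_m` (meters).
    n_fluid ~1.47 is an assumed oil-like index, not the real device's fluid.
    A negative radius models a concave (bowed-in) surface, giving the
    negative power a myopic wearer would dial in."""
    return (n_fluid - 1.0) / radius_m

# Bowing the membrane out to a 0.25 m radius of curvature:
p = lens_power_diopters(0.25)   # roughly +1.9 diopters
```

So modest, continuously adjustable curvatures cover the everyday prescription range, which is exactly why turning a dial works.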
The impact on those who can’t see? Huge! I’d suggest it’s almost as big as teaching someone how to farm. Some people are illiterate simply because they can’t see clearly enough to learn how to read or write. Prof. Silver’s solution has the potential to change all of that.
So, my hat is off to him as a scientist: excellent work, and much needed!!
Here are some links to the adaptive lenses:
- Guardian: Inventor’s 2020 vision: to help 1bn of the world’s poorest see better
- Prof. Josh Silver’s Organization: Center For Vision In The Developing World
And here are some talks that highlight these new adjustable lens glasses:
If you’ve been moved by this simple idea, consider sending a donation to them to help support vision for those around the developing world: Donate.
In the time I’ve been doing my research work at the Univ. of IL, I’ve come across a number of graphs in my numerous projects: from past researchers, in older papers, stuck on the sides of machines (calibration curves), and even hand-drawn or chart-recorder graphs. The only major problem with those graphs is that they aren’t in digital form, for further use with other data (instrument response functions) or for inclusion in your own work as a reference. So, what to do?
Well, there’s an easy solution. It’s not the perfect solution, as it’s a bit slow (I’ll get to that in a second), but it’s a great solution to the problem and has worked for me a number of times now. To top it off, it’s open-source, donation-ware, and cross-platform: Engauge Digitizer (see the post at LifeHacker.com). Don’t let the website and lack of recent updates deter you. Tools that can do what Engauge does are few and far between, so it is definitely worth a try. Here’s an example of how I used it just the other day (prompting this post; I’ve used it for years now, but the recent use reminded me I should share it with others). [click "More" to see an example use and learn more]
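Under the hood, what a digitizer does after you click two known points on each axis is essentially a linear (or log-linear) pixel-to-data mapping. Here’s a minimal sketch of that idea; it assumes axis-aligned axes, whereas Engauge itself also handles rotated and skewed graphs, and all the names and numbers are mine:

```python
import math

def make_axis_map(p1, v1, p2, v2, log=False):
    """Return a function mapping a pixel coordinate to a data value,
    calibrated from two reference points (pixel p, value v).
    Set log=True for a logarithmic axis (values must be positive)."""
    if log:
        a = (math.log10(v2) - math.log10(v1)) / (p2 - p1)
        return lambda p: 10 ** (math.log10(v1) + a * (p - p1))
    a = (v2 - v1) / (p2 - p1)  # data units per pixel
    return lambda p: v1 + a * (p - p1)

# x axis: pixel 100 reads 0.0, pixel 500 reads 10.0 on the graph.
to_x = make_axis_map(100, 0.0, 500, 10.0)
x = to_x(300)  # midway in pixels -> 5.0 in data units

# y axis on a log plot: pixel 400 reads 1.0, pixel 100 reads 1000.0.
to_y = make_axis_map(400, 1.0, 100, 1000.0, log=True)
```

Once the axes are calibrated like this, every clicked data point is just two function evaluations away from real units, which is why digitizing is tedious but mechanically simple.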
The society formerly known as the American Vacuum Society (AVS) is holding their international conference in Boston, MA next week! And I’ll be there! [Exciting!] If you are going to be there, drop me a line on the blog and we can snag a coffee or beer together.
The programs and cards sent out may have a familiar image!
My AFM image of the CuInSe2 Bicrystals I grow in our lab won second place in the Art-Zone competition last year! It was great fun! [Thanks goes to AVS for being a fantastic organization!]
Wish me luck with my talk!!
For the Materials Scientist, a very interesting event happened on Monday, 6/9/08. The Los Angeles Department of Water and Power may have inadvertently created the largest Materials Science demo of crystal stacking, defects, and boundaries when they placed 400,000 balls onto the surface of the Ivanhoe Reservoir. [Read the original news release (including video) by the LA Times (images from the LA Times) or the blog post at Design Verb where I found out about it.]
The balls are being placed on the surface of the lake to prevent the formation of various bromates [wikipedia]. The bromates form due to the presence of both chlorine and bromine in the water coupled with sunlight. Local public officials hope the balls will block sunlight from the lake water and prevent the bromate formation. [Let's hope heat doesn't drive the reaction either; those balls are going to get really hot with the sun blazing down on them!]
The layer of balls on the surface of the lake is almost exactly like a model often used as a classroom demo called a “Bubble Raft” [wikipedia]. The model illustrates, usually in 2 dimensions, the concepts of crystallites (polycrystals), grain boundaries, and dislocations. The more common demonstration is a table-top version consisting of a large number of bubbles on the surface of a thick liquid like glycerin.
There are even published articles using this approach as a model system for indentation measurements and the like. Another common place to see this phenomenon is in a pop from the machine (in a clear plastic bottle): you may see the bubbles form various areas of ordered stacking. Of course, the structure of a pop-bottle bubble raft is quite complex; stacking faults and dislocations abound.
If you look closely at the above image, you’ll notice a number of areas where the balls are all stacked nicely in 2D, and then, a little further away, a different orientation of stacking… where these two meet, there’s a “grain boundary” [wikipedia]. In fact, this is exactly what a polished and etched sample of polycrystalline metal looks like in a reflected-light microscope. Of course, in a sample like that you can’t see the atoms at all, just regions of different reflectivity. (See how the lightness/darkness of each “crystallite” is different?) Little lines in the microscope show the boundaries (often acid etching is required to see them). Of course, if you have a transmission electron microscope (TEM), you actually *can* see stacks of atoms and grain boundaries! [There is a world-class aberration-corrected TEM available at the FS-MRL where I currently do research. Hopefully I'll be able to use it to get some excellent images to share.]
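For the curious, the ball-raft picture is easy to toy with numerically: generate two hexagonally packed “grains” at different orientations, and the line where they fail to match up is your grain boundary. A small illustrative sketch (all names, sizes, and angles are mine):

```python
import numpy as np

def hex_grain(n, angle_deg, origin=(0.0, 0.0)):
    """Centers of an n-by-n patch of touching unit-diameter circles in a
    2D hexagonal close packing, rotated by angle_deg -- a toy model of
    one 'grain' of balls on the reservoir surface."""
    i, j = np.meshgrid(np.arange(n), np.arange(n))
    x = i + 0.5 * (j % 2)      # every other row shifted by half a ball
    y = j * np.sqrt(3) / 2.0   # row spacing so neighboring balls touch
    pts = np.column_stack([x.ravel(), y.ravel()])
    t = np.radians(angle_deg)
    rot = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
    return pts @ rot.T + np.asarray(origin)

# Two misoriented grains side by side; scatter-plot both and the seam
# between them is the 2D "grain boundary".
grain_a = hex_grain(10, 0.0)
grain_b = hex_grain(10, 15.0, origin=(12.0, 0.0))
```

Scatter-plotting `grain_a` and `grain_b` together reproduces, in miniature, exactly the mismatched-stacking regions visible in the reservoir photo.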
Frequently in Science we find beauty in the things around us. Along with all this beauty is some natural function or rule, often unique and interesting. All you have to do is keep your eyes open, walk a bit slower, and enjoy the walk (or bounce as the case may be)!