PtychographyPlan

From AG Strukturforschung / Elektronenmikroskopie

Priorities for the direction in which IDES and the ptychography investigation should go. This is a list in (loose) order of importance.

  1. Wider generalized potential (GP), based on fitting a non-regularized reconstruction to a set of Gaussians (for example with StatSTEM; see the Gaussian-fitting sketch after this list). In the meantime Marcel has already done this.
  2. Make all layers of the multislice equal, and in a second step make every second or third layer equal. This should reduce the effects of multiple scattering while at the same time preventing overfitting. Marcel has already made all slices equal; tying every second or third slice is work in progress. We are now deciding on the largest slice thickness that still allows a good multiple-scattering approximation. This slice thickness will then be used to determine the sample's thickness by adding slices until the error starts going up again (see the slice-tying and thickness-search sketch after this list).
  3. Determine the regularization parameter µ by choosing it such that the mean of the normalized squared residuals is 1 (the discrepancy principle; see the µ-bisection sketch after this list).
  4. Make the reconstruction protocol less trial-and-error by deciding which quantity (potential, probe, or probe positions) to optimize next from a properly scaled derivative of the error function. As a first step, we will print out these scaled derivatives while running through a protocol that we already know works well, and simply inspect them (see the scheduling sketch after this list).
  5. When the above is done, we can apply it to the structured illumination data from Jülich.
  6. Time-resolved ptychography. In a first attempt we will measure one set of diffraction patterns per time step. Once we get better at it, we'll try a compressed-sensing approach as in IDES' video-data results.
  7. Closely related to this is "single-shot ptychography," which in fact circles back to coherent diffraction imaging (CDI), where the object is estimated from a single CBED pattern. This requires regularization, and if it works it would be great to use in time-resolved ptychography. A good application would be imaging tiny particles such as the Pt30 clusters. Marcel already has promising results with this in his work on adaptive sampling (see the CDI sketch after this list).
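
Gaussian-fitting sketch (item 1). A minimal Python sketch of fitting a non-regularized reconstruction to a sum of 2D Gaussians, along the lines of what StatSTEM does; the names here (potential, peak_guesses) are hypothetical, and the initial peak positions would come from a peak finder.

  import numpy as np
  from scipy.optimize import least_squares

  def gaussian_sum(params, X, Y, n_peaks):
      # params holds (amplitude, x0, y0, sigma) for each peak
      model = np.zeros_like(X)
      for i in range(n_peaks):
          A, x0, y0, sigma = params[4 * i:4 * i + 4]
          model += A * np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2.0 * sigma ** 2))
      return model

  def fit_potential(potential, peak_guesses, sigma0=1.0):
      # least-squares fit of the non-regularized reconstruction to Gaussians
      ny, nx = potential.shape
      Y, X = np.mgrid[0:ny, 0:nx].astype(float)
      p0 = np.concatenate([[potential[int(y), int(x)], x, y, sigma0]
                           for (x, y) in peak_guesses])
      def residuals(p):
          return (gaussian_sum(p, X, Y, len(peak_guesses)) - potential).ravel()
      return least_squares(residuals, p0).x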
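
Slice-tying and thickness-search sketch (item 2). A minimal sketch of the two ideas in item 2, assuming a hypothetical reconstruct(data, n_slices, slice_thickness) that runs a tied-slice reconstruction and returns the data error; this is illustrative, not our actual code.

  def tied_slices(free_potentials, n_slices):
      # reuse the free potentials cyclically: one free array ties all slices,
      # two or three free arrays give "every second/third slice equal";
      # gradients from all slices then accumulate on the shared arrays
      period = len(free_potentials)
      return [free_potentials[i % period] for i in range(n_slices)]

  def find_thickness(data, slice_thickness, reconstruct, max_slices=50):
      # add slices until the reconstruction error starts going up again
      best_err, best_n = float("inf"), 0
      for n in range(1, max_slices + 1):
          err = reconstruct(data, n_slices=n, slice_thickness=slice_thickness)
          if err > best_err:
              break
          best_err, best_n = err, n
      return best_n * slice_thickness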
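
µ-bisection sketch (item 3). A minimal sketch of choosing µ by the discrepancy principle: bisect on log µ until the mean normalized squared residual hits 1. Here mean_norm_sq_residual(µ) (run a reconstruction at that µ and return the statistic) is a hypothetical placeholder, and monotonicity in µ is assumed.

  import math

  def choose_mu(mean_norm_sq_residual, mu_lo=1e-6, mu_hi=1e2, tol=0.01, max_iter=50):
      # assumes the statistic grows monotonically with µ: stronger
      # regularization -> worse data fit -> larger residuals
      for _ in range(max_iter):
          mu = math.sqrt(mu_lo * mu_hi)      # geometric midpoint of the bracket
          r = mean_norm_sq_residual(mu)
          if abs(r - 1.0) < tol:
              break
          if r > 1.0:
              mu_hi = mu                     # over-regularized, lower µ
          else:
              mu_lo = mu                     # under-regularized, raise µ
      return mu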
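
Scheduling sketch (item 4). A minimal sketch of the derivative-based scheduling, with a hypothetical grad_error(name, value) returning dE/d(quantity) and hypothetical per-quantity scales; the point for now is only to log the scaled norms while running a known-good protocol.

  import numpy as np

  def scaled_derivatives(grad_error, params, scales):
      # params and scales are dicts keyed by 'potential', 'probe', 'positions'
      return {name: np.linalg.norm(grad_error(name, value)) * scales[name]
              for name, value in params.items()}

  # first step: print(scaled_derivatives(grad_error, params, scales)) at each
  # stage of a protocol we already know works well, and just look at the values;
  # later, optimize the quantity with the largest scaled derivative next.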
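
CDI sketch (item 7). A minimal sketch of estimating an object from a single pattern by classic error reduction, with a support constraint acting as the regularization; measured_amplitude and support are hypothetical inputs, and a real single-shot ptychography reconstruction would also need the probe and stronger priors.

  import numpy as np

  def error_reduction(measured_amplitude, support, n_iter=200, seed=0):
      # start from random noise inside the support
      obj = np.random.default_rng(seed).standard_normal(measured_amplitude.shape) * support
      for _ in range(n_iter):
          F = np.fft.fft2(obj)
          F = measured_amplitude * np.exp(1j * np.angle(F))  # keep measured modulus
          obj = np.fft.ifft2(F).real
          obj *= support       # real-space support constraint (the regularization)
          obj[obj < 0] = 0.0   # non-negativity
      return obj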

Adaptive sampling. I didn't include this in the list because Marcel is already working on it in parallel, with promising results. It can be developed further more or less independently of the above, until it is in a good enough state to be applied to a real-life application (on the Nion perhaps, or at EMAT, where they can scan arbitrary patterns; other suggestions are welcome). Showing that adaptive sampling beats the noise limit (which compressed sensing cannot) would have a large impact in the TEM literature and beyond. It might be useful in a time-resolved context too.
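
A minimal sketch of the position-selection step in adaptive sampling, assuming some hypothetical per-pixel uncertainty map (e.g. a residual or posterior-variance estimate) from the current reconstruction; the acquisition step itself is instrument-specific and omitted.

  import numpy as np

  def next_positions(uncertainty_map, n_new):
      # indices of the n_new pixels where the reconstruction is most uncertain;
      # these become the next probe positions to scan
      flat = np.argsort(uncertainty_map.ravel())[-n_new:]
      return np.column_stack(np.unravel_index(flat, uncertainty_map.shape))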