Description of the Analysis Printout
When the analysis is printed, the first line shows the sample type ("st" for standard or "un" for unknown), the sample number and the sample name. If the sample is a standard, the sample set number is also shown. The microprobe takeoff angle and the operating voltage are also displayed.
The next line indicates the total number of lines (or data points) contained in the sample; each sample may contain from 1 to 100 points. The same line also shows the number of points with a status of "G" for good, meaning points that have not been disabled. A sample with all points disabled will have zero "G" points; it will not be used in any calibrations or recalculations and therefore cannot be quantitatively analyzed, except by using the Analyze Selected Lines button.
Next, the average elemental totals and the total, calculated and excess oxygen are displayed. The calculated oxygen is the amount of oxygen computed from cation stoichiometry (if selected), and the excess oxygen is the difference between the measured and the calculated oxygen when the Display As Oxide option is selected. This excess oxygen is often very useful for determining whether the selected cation ratios are correct, especially for iron-bearing oxides.
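A minimal sketch of this excess-oxygen bookkeeping (both wt% values below are hypothetical, not from any real analysis):

```python
# Minimal sketch of the excess-oxygen calculation described above.
# Both values are hypothetical wt% figures, not real data.
measured_oxygen = 31.2    # oxygen from the Display As Oxide calculation
calculated_oxygen = 30.0  # oxygen implied by the selected cation stoichiometry

excess_oxygen = measured_oxygen - calculated_oxygen
print(f"Excess oxygen: {excess_oxygen:+.2f} wt%")  # prints +1.20 wt%
# A persistently non-zero excess suggests the assumed cation ratios
# (e.g. the Fe2+/Fe3+ split in an iron-bearing oxide) may need revisiting.
```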
Next are the atomic weight and average Z-bar for the sample. These are followed by the average number of ZAF iterations needed to converge each data point in the sample, and the average number of MAN iterations, which are the iterations required to converge the MAN background, interference, APF (Area Peak Factor) and Time Dependent Intensity (TDI) element corrections. The Z-bar is defined as the sum of the atomic numbers of all elements in the sample (including specified elements), each weighted by its weight fraction.
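A rough illustration of that Z-bar definition (the weight fractions below are approximate values for forsterite, Mg2SiO4, used only as an example):

```python
# Z-bar as the weight-fraction-weighted sum of atomic numbers.
# Composition: element -> (weight fraction, atomic number); values are
# approximate forsterite (Mg2SiO4) numbers, for illustration only.
composition = {
    "Mg": (0.346, 12),
    "Si": (0.200, 14),
    "O":  (0.455, 8),
}

z_bar = sum(wt * z for wt, z in composition.values())
print(f"Z-bar = {z_bar:.2f}")  # approximately 10.59
```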
Following this are the weight percentages and standard deviations of any elements specified for the sample by fixed concentration, difference or stoichiometry.
The specified element type is listed for each specified element. The type "SPEC" means that the element concentration is truly specified. That is, either specified by the user or (for standard samples) loaded from the standard composition database. The other types are "DIFF" for element by difference, "CALC" for element by stoichiometry to stoichiometric oxygen and "RELA" for an element by stoichiometry to another element.
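A schematic sketch of how these specified-element types could be computed (the element names, ratios and concentrations below are hypothetical; this is not the program's actual code):

```python
# Schematic sketch of the specified-element types described above.
# All concentrations and ratios are hypothetical.
analyzed = {"Si": 21.3, "Mg": 25.4}  # measured wt% of the analyzed elements

# "DIFF": one element assigned the difference from a 100 wt% total
fe_by_difference = 100.0 - sum(analyzed.values())

# "RELA": an element computed by a fixed ratio to another element
# (here 0.05 times the Mg concentration; the ratio is illustrative)
mn_by_relative = 0.05 * analyzed["Mg"]

# "CALC": an element computed by stoichiometry to the stoichiometric
# oxygen, e.g. one cation per two oxygens (SiO2-like, illustrative)
stoich_oxygen = 42.0  # hypothetical calculated-oxygen wt%
si_by_stoich = stoich_oxygen * 28.086 / (2 * 15.999)

print(fe_by_difference, mn_by_relative, si_by_stoich)
```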
The next two lines indicate the type of background correction used for each element ("MAN" for mean atomic number corrected and "OFF" for off-peak corrected elements) and the average count times for each element in the sample. Remember that the count times for each element can be different for each line in the sample. To see the actual counting time details use the Data button in the Analyze! window with the Debug Mode menu checked.
Note that if a MAN background-corrected sample was analyzed, the program will also indicate the magnitude of the absorption correction to the continuum background counts for each MAN-corrected element in the line labeled "%ABS".
The analyzed element symbols are printed next, with the weight percentages calculated for each data point listed below them for each element. The elemental weight results are followed by the average "AVER" of each element column, the standard deviation "SDEV" and the standard error "SERR". The standard deviation is a measure of the scatter of the individual results, while the standard error is essentially the precision of the average. Finally, the percentage relative standard deviation "%RSD" of the results is printed, which is simply the standard deviation divided by the average, times 100.
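These column statistics follow the usual definitions; a short sketch with made-up data points:

```python
import statistics

# Column statistics as described above; the five data-point values
# are made up for illustration.
wt_pct = [21.1, 21.4, 21.2, 21.5, 21.3]  # one element, five data points

aver = statistics.mean(wt_pct)         # "AVER"
sdev = statistics.stdev(wt_pct)        # "SDEV" (sample standard deviation)
serr = sdev / len(wt_pct) ** 0.5       # "SERR" (standard error of the mean)
rsd  = sdev / aver * 100.0             # "%RSD"

print(f"AVER {aver:.3f}  SDEV {sdev:.3f}  SERR {serr:.4f}  %RSD {rsd:.2f}")
```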
Note that one can perform analyses of samples which are not unknowns; for example, it is possible to have the program analyze a standard as though it were an unknown. Therefore, if the sample is a standard sample, the program next lists the published "PUBL" weight percentage value for each element as entered in the default standard database. If the element is not found in the standard database, it is shown as "n.a." for "not analyzed". The next line lists the per cent variance "%VAR" of the actual measured average from the published value for each element. This provides a valuable check on the quality of your analyses for secondary standard elements, that is, element channels which are not assigned as the primary calibration for that element. After this is the line labeled "DIFF", which is the simple difference between the "AVER" and "PUBL" values.
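For instance, a sketch of those two lines for a single element (both values are hypothetical):

```python
# "%VAR" and "DIFF" for one element of a secondary standard check.
# Both values are hypothetical.
publ = 21.30  # published wt% from the standard database ("PUBL")
aver = 21.45  # measured average wt% ("AVER")

var_pct = (aver - publ) / publ * 100.0  # "%VAR", per cent variance
diff = aver - publ                      # "DIFF", simple difference
print(f"%VAR {var_pct:+.2f}   DIFF {diff:+.3f}")  # +0.70   +0.150
```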
The actual primary standard used for each element is displayed on the next line, "STDS", since standard assignments can be made on a sample-by-sample basis.
Note that sometimes the average analyzed value of an assigned standard appears not to be exactly the value shown in the "PUBL" line. It seems reasonable that they should always be exactly the same, but remember that because Probe for EPMA treats all samples (standards and unknowns) as unknowns when performing an analysis, you may see a small discrepancy if the standard contains other analyzed elements that are not assigned. This is because Probe for EPMA calculates the correction factors for an analysis of a sample based on the actual analyzed composition of the sample, not the theoretical composition in the STANDARD database. Of course, for the analytical calibration, Probe for EPMA uses the database composition for the calculation of standard k-factors.
Below this are shown the average standard k-factors "STKF" (or the standard beta-factors "STBE" if using alpha-factors) and the average standard counts "STCT" (all intensities are in counts per second per nominal beam). These are followed by the normalized unknown k-ratio "UNKF", the unknown net count rate (Peak-Bkg) "UNCT", the unknown background counts "UNBG" (either MAN or off-peak measured), the ZAF correction factor "ZCOR" if using ZAF or Phi-Rho-Z (or the unknown beta-factor "UNBE" if using alpha-factors), and the average raw k-ratio "KRAW", which is the normalized and background-corrected unknown counts divided by the standard counts.
The "KRAW" value shown is corrected for all corrections except the matrix correction. These include the deadtime, count time, beam drift, MAN or off-peak background, and the quantitative interference and APF (area peak factor) corrections. The peak to background ratio for each element is shown in the "PKBG" line. A peak to background of close to 1.0 means that no peak was present.
Below this are listed the per cent correction to the counts for any interference corrections that were performed on the sample in the line labeled "%INT".
Finally, if the Time Dependent Intensity (TDI) element extrapolation was selected, the program will show the per cent change in the x-ray counts due to the TDI extrapolation in the line labeled "VOL%", along with the average deviation of the TDI fit data in the line labeled "DEV%". Also, if any area peak factors (APF) were selected, the program will print the sum of the APFs for each affected element in the line labeled "APF:" (1.00 indicates no APF correction).
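One plausible way to picture the "VOL%" figure (the actual extrapolation is a fit performed by the program; the count rates below are hypothetical):

```python
# "VOL%" pictured as a per cent change in counts due to the TDI
# extrapolation. Counts are hypothetical; the program's fit is more involved.
counts_measured = 1250.0  # average measured count rate during acquisition
counts_tdi      = 1180.0  # count rate extrapolated to time zero by the TDI fit

vol_pct = (counts_tdi - counts_measured) / counts_measured * 100.0
print(f"VOL% {vol_pct:+.2f}")  # prints VOL% -5.60
```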
If the user elected to calculate the detection limits and/or sample statistics, the program will then print those calculations.
Following this are formula and atomic percent calculations if they were selected. All sample calculation options can be assigned to a single sample or a range of samples.