Statistical Optimization of Wafer Yields - Lab Report Example

Summary
This lab report, "Statistical Optimization of Wafer Yields", presents a concise statistical account of wafer thickness as affected by speed, distance, and pressure. The analysis determines the best conditions for producing a uniform wafer…

Extract of sample "Statistical Optimization of Wafer Yields"

Statistical Optimization of Wafer Yields
Name of Student
Institution Affiliation
Date

Abstract
Integrated circuits are largely fabricated on silicon chips that are cut from silicon wafers. Wafer fabrication involves a number of steps that determine which chips will be successful. Silicon wafers are produced through the Czochralski process, in which crystals are grown, and integrated circuits are transferred onto the wafers by lithography. The success of this transfer depends on the control parameters established at the fabrication plant. The number of successful chips is measured as the yield and can be estimated using probability methods. The laboratory exercise is a model of silicon wafer production in which the data obtained were used to estimate the yield of the fabricated chips.

Introduction
The probability of obtaining the best-yielding chips from the wafers is a function of the control methods and measures in place. The production process is therefore carefully controlled to obtain uniform wafers of crystalline silicon. Crystalline silicon is preferred for two reasons: first, it is abundant, being the second most plentiful element in the Earth's crust after oxygen; second, it is environmentally friendly. Each silicon wafer contains a number of chips that can be used in the manufacture of integrated circuits. The chips found to be usable in IC production constitute the acceptable yield. The number of successful yields depends on three parameters, namely speed, pressure, and distance. These three parameters determine the thickness of the wafers and hence the success of the yields. Controlling the thickness in the plant requires that it be uniform across the entire wafer, which is a difficult task for control engineers to achieve. The control system must therefore be optimized over the registered variables. Taking the system response as Y, the state variables that determine the condition of the system are: speed (X1), pressure (X2), and distance (X3). By optimizing the control variables, a more uniform thickness can be achieved and hence maximum yield produced. In the experiment below, statistical methods are applied to the experimental data obtained in order to find the optimal conditions for producing maximum yield (Sullivan, 2005).

Experimentation and Data
The data were collected and arranged in sheet 1, as shown in the appendix, with the thickness recorded for each of the samples investigated. For the first test, both pressure and distance were held constant and thickness values for the wafer samples were taken at both low and high speed. A total of forty samples were obtained and recorded in the sheet. The standard deviation of the thickness was computed for the forty wafers (Conover, 2013); an illustrative calculation is sketched below, after the aims.

Aims
• To compose a concise, statistically detailed report on the thickness of the wafer as affected by speed, distance, and pressure.
• To determine the best conditions for producing a uniform wafer.
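The standard deviation calculation mentioned in Experimentation and Data can be illustrated with a short script. This is a sketch, not part of the original report: it uses Python's statistics module, and the two lists simply copy the forty thickness values from the dataset table in the Results section.

```python
# Sketch: mean and sample standard deviation of wafer thickness for the
# low-speed group (samples 1-20) and the high-speed group (samples 21-40).
# Values are copied from the dataset table in the Results section.
from statistics import mean, stdev

low_speed = [74, 78.9, 90.3, 82.1, 89.4, 90.9, 90.2, 69.2, 76.2, 82.2,
             81.9, 87.3, 104.9, 74.4, 87.7, 87.8, 88.9, 102.7, 95.9, 95.5]
high_speed = [151.1, 136.8, 161.8, 147, 154.7, 143.1, 150.2, 150.3, 135.9, 141.1,
              145.5, 149.2, 152.3, 168.7, 152.9, 146.1, 161.4, 178.9, 160.5, 142.2]

for name, data in (("low speed", low_speed), ("high speed", high_speed)):
    # stdev() gives the sample standard deviation (n - 1 denominator),
    # matching spreadsheet sample-based functions.
    print(f"{name}: n={len(data)}, mean={mean(data):.1f}, stdev={stdev(data):.2f}")
```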
Results and Discussion

Dataset
Pressure and distance were held constant at a low value in these 40 tests.

Sample   Thickness (low speed)     Sample   Thickness (high speed)
1        74                        21       151.1
2        78.9                      22       136.8
3        90.3                      23       161.8
4        82.1                      24       147
5        89.4                      25       154.7
6        90.9                      26       143.1
7        90.2                      27       150.2
8        69.2                      28       150.3
9        76.2                      29       135.9
10       82.2                      30       141.1
11       81.9                      31       145.5
12       87.3                      32       149.2
13       104.9                     33       152.3
14       74.4                      34       168.7
15       87.7                      35       152.9
16       87.8                      36       146.1
17       88.9                      37       161.4
18       102.7                     38       178.9
19       95.9                      39       160.5
20       95.5                      40       142.2

Question 1. Sources of variability in the data
In the above data set, speed is the independent variable and thickness is the dependent variable, so the variation in thickness is examined against speed. For the twenty samples taken at each speed in the laboratory, the following chart can be used to compare the trends, where samples 21–40 represent high-speed values and samples 1–20 represent low-speed data. From the graph it can be deduced that the thickness values are consistently higher for measurements taken at high speed than for those taken at low speed. A pattern can be seen in the sequence of the data over samples 8 to 13 for low speed and samples 29 to 34 for high speed: in both, a steady rise with nearly the same gradient is noted, which then falls into a dip after the peak values were captured. It is also noted that the peak values occurred at sample 13 for low speed and sample 34 for high speed, at 104.9 and 168.7 respectively. The variability above can be attributed to several sources, including errors due to instruments, calibration errors, reading errors, and the mode of measurement recording.

i. Errors due to instruments
In Czochralski wafer manufacture, the wafers are produced from crystalline silicon under a controlled environment. Much of the time this is done in a vacuum to keep impurities from doping the silicon seed being grown. Nonetheless, uncleaned instruments can introduce foreign material that is then incorporated into the silicon crystals, forming a non-uniform wafer surface and thereby reducing the yields.

ii. Calibration errors
Control instruments are prone to drift from their preset settings due to strain or wear and tear. This introduces an offset error that needs to be corrected so that the equipment records accurate and precise values. The calibration error also depends on the sensitivity of the equipment and control machines in use.

iii. Reading errors
These arise mostly from parallax and produce the majority of transposition and translation errors. Values read incorrectly are inaccurate and, although they may appear consistent, they introduce variation and deviation of the data from the standard; the plotted values therefore differ from the expected ones. This kind of variation is not easily corrected except through care and recommended reading practices, such as viewing the scale perpendicularly. With digital equipment, however, this kind of variation is absent, particularly if the data are read electronically and directly from the capture device. In this case, with manual data entry, the error is bound to be present.

iv. Mode of measurement recording
a. Experience. This varies with the individual responsible for the measurement. Difficulty in reading could result in misread figures and values, and consequently unreliable and inconsistent data could be recorded. From the second high-speed sample, as shown in the chart above, it can be seen that a sudden surge occurred in the recording. Such fluctuations could come from instability of the equipment or could also result from difficult recording conditions and instrument set-up errors.
b. Depth of understanding of the subject. This significantly influences the measurement of the initial values used to establish whether the conditions are properly set. Previous experimental data show that this error, though present, is minimal. Consistency is a measure of quality for this variation (Stevens, 1951).

Question 2. Parametric and non-parametric test
The null hypothesis is the claim that the consistency of the coating is not the same for each speed. On the assumption that the observations in the recorded data are independent and raw, meaning they have not been manipulated, the analysis can proceed to test the distribution curve. The distribution of the values for the low-speed and high-speed runs can be depicted from the radar chart plotted from the values, which shows the spread of the data (Conover, 2013). From the chart it can be seen that for the low speed a notable region spans samples 1–16, which supports the view that the sample indeed follows a normal distribution curve. For the high-speed measurements, the peak value occurs at sample 18, which is close to the averaged count, and hence that sample likewise follows suit.
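The distribution check and group comparison discussed in Question 2 can be illustrated in code. The report does not name the specific tests it uses, so the Shapiro–Wilk normality check and the Mann–Whitney U test below are assumptions chosen for illustration; the data are the thickness values from the dataset table, and SciPy is assumed to be available.

```python
# Sketch (assumed tests, not specified in the report): check each speed group
# for normality, then compare the two groups with a non-parametric test.
from scipy import stats

low_speed = [74, 78.9, 90.3, 82.1, 89.4, 90.9, 90.2, 69.2, 76.2, 82.2,
             81.9, 87.3, 104.9, 74.4, 87.7, 87.8, 88.9, 102.7, 95.9, 95.5]
high_speed = [151.1, 136.8, 161.8, 147, 154.7, 143.1, 150.2, 150.3, 135.9, 141.1,
              145.5, 149.2, 152.3, 168.7, 152.9, 146.1, 161.4, 178.9, 160.5, 142.2]

# Shapiro-Wilk test of the hypothesis that each sample comes from a normal distribution.
for name, data in (("low speed", low_speed), ("high speed", high_speed)):
    w, p = stats.shapiro(data)
    print(f"Shapiro-Wilk {name}: W={w:.3f}, p={p:.3f}")

# Mann-Whitney U test: a non-parametric comparison of the two groups
# that does not assume normality (two-sided alternative).
u, p = stats.mannwhitneyu(low_speed, high_speed, alternative="two-sided")
print(f"Mann-Whitney U={u:.1f}, p={p:.4g}")
```

A small p-value from the Mann–Whitney U test would indicate that the thickness distributions differ between the two speeds, which is consistent with the trend noted in Question 1.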
Question 3. Calculating the significance value, β
The method of least squares for a model with multiple independent variables can be written as

Y = β0 + β1X1 + β2X2 + β3X3 + ε

where X1 = speed, X2 = pressure, and X3 = distance. In simpler terms, the coefficient vector is given by the normal equations, β = (XᵀX)⁻¹ XᵀY. Using the data provided from the experiment, the covariance matrix was calculated and then used to solve for the values of β. Substituting the calculated covariance values into these equations and solving the resulting simultaneous equations yields β = 0.

Question 4. Simpler derivation of Question 3
A simpler method of deriving the values of β in Question 3 above is to draw the regression line from the data provided; the confidence band can then be determined from that line. The prerequisite, however, is finding the covariance matrix. This is simpler because the spreadsheet contains built-in statistical functions, as described below. The elements of the covariance matrix are found from the formula:
=COVARIANCE.S(Array1, Array2)
Applying the formula to elements 1 to 6 of the 3 x 3 matrix gives the covariance entries, and the coefficients b0 to b2 were then obtained from the regression functions as shown. The variation is displayed in an x-y scatter chart, which indicates the span of optimal values for the three parameters, with limits on the sample between -1 and 1.
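Questions 3 and 4 above describe estimating β from the covariance matrix and the spreadsheet's regression functions. The following Python sketch illustrates the same least-squares calculation; it is not part of the original report. The factor arrays are hypothetical placeholders, since this extract reproduces only the thickness values, and NumPy's covariance and least-squares routines stand in for the spreadsheet's COVARIANCE.S and regression functions.

```python
# Sketch of the least-squares fit Y = b0 + b1*X1 + b2*X2 + b3*X3 described in
# Question 3. The factor values below are hypothetical placeholders; replace
# them with the experimental sheet from the appendix.
import numpy as np

speed     = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])          # X1 (placeholder)
pressure  = np.array([0.5, 0.6, 0.6, 0.5, 0.5, 0.6])          # X2 (placeholder)
distance  = np.array([10., 12., 11., 13., 12., 10.])          # X3 (placeholder)
thickness = np.array([80., 82., 120., 124., 150., 155.])      # Y  (placeholder)

# Sample covariance matrix of the three factors, the analogue of filling the
# 3 x 3 matrix with =COVARIANCE.S(Array1, Array2) in the spreadsheet.
factors = np.vstack([speed, pressure, distance])
cov_matrix = np.cov(factors, ddof=1)
print("Covariance matrix:\n", cov_matrix)

# Normal-equation solution beta = (X'X)^-1 X'Y, with a column of ones for b0.
X = np.column_stack([np.ones_like(speed), speed, pressure, distance])
beta, *_ = np.linalg.lstsq(X, thickness, rcond=None)
print("Coefficients b0..b3:", beta)
```

With the actual sheet in place, beta[0]..beta[3] correspond to the intercept and the speed, pressure, and distance coefficients used to judge which settings give the most uniform thickness.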

References
Conover, W. (2013). Practical Nonparametric Statistics (2nd ed.). New York: John Wiley and Sons.
Stevens, S. S. (1951). Mathematics, measurement and psychophysics. In Handbook of Experimental Psychology. New York: John Wiley.
Sullivan, L. (2005, December 7). Nonparametric Tests. Retrieved December 7, 2016, from sph.web.bumc.bu.edu.