
Oral History Interview with Chih-shun (Larry) Lu (LL)
Conducted by Ric Shimshock (RS)



RS: We’re here with Dr. Chih-shun (Larry) Lu, and we’d like to discuss a little bit about some of his background in the development of process monitoring for optical, semiconductor and vacuum coating applications. Larry, where did you begin your schooling for your professional background?

LL: I came to the United States in 1959 after finishing my bachelor’s degree in Taiwan. I spent a year and a half at Auburn, Alabama, to get my master’s degree. Then I found my first job in Syracuse, New York, and at the same time started to work on my Ph.D. degree in physics at Syracuse University.

RS: Was the degree experimental or theoretical in nature? Did you study materials?

LL: Actually, it’s a long story. To make it short, I got involved in the startup of Inficon in 1970 while still trying to find a topic for my Ph.D. dissertation. It was rather difficult to start a new company while working on a dissertation at the same time. Fortunately, I had a very understanding advisor, who also happened to be one of the investors. It ended up that I was able to use some of the physics behind a product I was developing for Inficon as my dissertation topic. The investigation was on the optical emission from certain metal vapors under electron excitation. So it was basically experimental work, but with practical applications in mind.

RS: And how did you become associated with Inficon?

LL: I started Inficon. My first job in Syracuse was with Carrier Air Conditioning Company. It happened that Carrier owned a company called Spectrol at that time, whose main product was trimmer potentiometers. The research division of the parent company therefore started a project on new thin film materials for resistors. That was how I got my thin film process experience from an air conditioning company. After a few years, I moved to Syracuse University Research Corporation and worked on government contracts supported by the military and NASA. At the very end of the ’60s, NASA had a big cutback and most contract support was gone. So some colleagues and I decided that we should start our own business. That's how Inficon got started in Syracuse.

RS: What were some of the first products that you wanted to bring out at Inficon?

LL: When the idea of starting a business came up, of course the first question was what we were going to make. Unfortunately, the NASA project that I had just finished dealt with rocket-borne sensors for upper atmospheric studies - it was kind of esoteric and didn’t look like it had any commercial potential. On the other hand, I had accumulated several years of experience using quartz crystal monitors in my thin film laboratory. At that time, Sloan dominated the quartz crystal monitor market. The most popular unit had a beautiful walnut case and also produced a sound with a variable pitch during film growth. During the later stages of using that instrument, every time I looked at that large analog meter I often said to myself that it would be nice if the meter could be replaced by a few Nixie tubes – to make it digital.

The next thing to do was a marketing study. My method of market survey at that time went something like this: considering myself an average user, if I liked the digital display, perhaps half of the users of this instrument would like the feature also. We then consulted with some electronics engineers, and the answer was that there was no problem converting the quartz crystal monitor from analog to digital. Thus the idea for the first product was born – a digital quartz crystal monitor (QCM).

RS: What was your first sale of the digital QCM?

LL: While we were still busy with the startup process, another new company, Kronos, moved ahead of us by introducing the first digital quartz crystal monitor to the market. Our product was not ready yet, but it was too late for us to back out. All the investors had already committed the capital. We had no choice but to go ahead and finish our own digital unit. One nice thing was that everybody liked the new Kronos instrument. So at least it proved that my idea was a good one. What we decided to do next was to put in a few extra features. One feature that we developed allowed the film density to be entered directly so the instrument could display film thickness in angstrom units. It was a unique feature and we were proud of it. However, by the time our first product was introduced, the popular Kronos digital quartz crystal monitor had already captured the majority of OEM customers. Probably the only major vacuum company that had not aligned with Kronos was Airco-Temescal. We made a presentation to Hugh Smith and quickly signed an OEM agreement.

RS: How did the first unit work?

LL: The marketing and sales people at Airco-Temescal loved it. The users liked it too. But it was tough competition against Kronos. Performance-wise there was not that much difference. The direct film density entry was a convenient feature and we tried to emphasize it. We even put in the just-introduced and very expensive LED displays instead of Nixie tubes to make the unit look nicer. But we really did not have that much competitive edge. We were in the unfavorable position of being a latecomer trying to capture a portion of a specialty market.

RS: What were the first uses of the Inficon instrument? Was it for thin film metallizations or semiconductor processing, or…?

LL: Well, it was used for all types of thin film deposition processes. I think the quartz crystal monitors were used mainly for metallization processes at that time. Early quartz crystal monitors were not very reliable or accurate for dielectric coating processes, so people in the optical coating industry did not feel comfortable using them to control dielectric coatings. By today’s standards, some of the thin film coating processes at that time were rather primitive. The major contribution that I made to quartz crystal monitor technology came after we introduced the first product. The basic understanding of the quartz crystal monitor, as accepted by nearly all users at that time, was that the frequency change of the sensing crystal was directly related to film thickness. Regardless of what kind of material was on the crystal, the deposited mass could be converted into thickness using the value of the film density only. The properties of the deposited material did not enter the equation. This was very nice and simple from the user’s point of view. However, from a physicist’s point of view, that didn’t make much sense, especially when you learned that the sensing quartz is basically a mechanical resonator. So I started to reexamine the fundamentals of the quartz crystal microbalance. I think the reason that people were not interested in this problem earlier was the poor sensor head designs. The life of the sensing crystal was fairly short, so users did not see any discrepancy during the crystal's short lifetime. But as the sensor design improved, and users could put more and more material onto the crystal, some discrepancies started to become noticeable. I came to the conclusion in 1971 that the mechanical properties of the deposited material must be taken into consideration in the quartz crystal microbalance. That means the same frequency shift for different materials indicates a different amount of deposited mass.
So I worked with a graduate student at Syracuse University, and we found an earlier paper that touched upon this subject. However, the analysis was rather complicated, with all parameters lumped into a cumbersome equation. We first managed to reduce the complicated formula to a rather elegant form. Then I coated a large number of crystals with different materials to test the accuracy of this new formula. We found the experimental data fit the theoretical formula perfectly. Our results were first published in 1972. However, the new formula, although it looked simple, contained transcendental functions and could not be implemented in an instrument with the simple digital circuits of that time. By fortunate timing, Intel introduced its first microprocessor at about the same time. We immediately started a project to develop a microprocessor-based quartz crystal monitor. We wanted to implement the new measurement formula in the instrument. And on top of that, because of the power of the microprocessor, we could add many, many features to our quartz crystal monitor. So we really made a very significant breakthrough in the quartz crystal monitor market. It was just the right time to have the right ingredients to make the right instrument. In the new quartz crystal monitor, I introduced the term Z-factor for the film to get a more accurate thickness reading. We did not file a patent on the method, so the formula was quickly adopted by all commercial quartz crystal monitors developed in the following years.
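[Editor's note: For readers who want to see the arithmetic, here is a minimal sketch of the Z-match relation Lu describes, next to the older density-only conversion it improved upon. The function names and the example gold constants are illustrative; the quartz constants are the commonly tabulated values for AT-cut crystals.]

```python
import math

N_AT = 1.668e13      # frequency constant of AT-cut quartz, Hz * angstrom
RHO_Q = 2.648        # density of quartz, g/cm^3

def z_match_thickness(f_unloaded, f_loaded, rho_film, z_ratio):
    """Film thickness (angstroms) from the unloaded/loaded crystal
    frequencies via the Z-match relation.  z_ratio is the acoustic
    impedance ratio Z = sqrt(rho_q*mu_q / (rho_f*mu_f)) -- the film
    "Z-factor" the interview refers to."""
    phase = math.pi * (f_unloaded - f_loaded) / f_unloaded
    return (N_AT * RHO_Q / (math.pi * rho_film * f_loaded * z_ratio)
            * math.atan(z_ratio * math.tan(phase)))

def density_only_thickness(f_unloaded, f_loaded, rho_film):
    """Older mass-only approximation: frequency shift converted to
    thickness using the film density alone, with no material Z-factor."""
    return N_AT * RHO_Q * (f_unloaded - f_loaded) / (f_unloaded**2 * rho_film)
```

For very thin films the two formulas agree; the Z-factor correction matters as more material accumulates on the crystal, which is exactly the discrepancy Lu says became visible once sensor heads lasted longer.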

RS: After you had added these new features and this new capability, did you find more market acceptance? Were there different markets that you started looking at?

LL: Our new microprocessor-based quartz crystal monitor was a big success. At that time the applications of physical vapor deposition processes were also expanding. Of course, with wider use of the processes came growth in quartz crystal monitor usage. One developing market was optical coating applications. In the early days, due to the poorly designed quartz crystal sensor heads, the optical people were very unhappy with quartz crystal monitors because—

RS: Crystal failure.

LL: Yes, untimely crystal failure. The sensing crystal might quit in the middle of a deposition process. That problem was particularly serious for films that generated high stress. Then, as we improved the design of the sensor head and the accuracy of the measurement formula, more and more people in the optical coating field began to trust the quartz crystal monitor for thickness control. Many people also began to use the quartz crystal monitor for controlling the deposition rate, so that the index of refraction of the deposited film could be kept consistent. The optical monitor was used as an endpoint detector to get the right optical thickness. This type of dual monitoring system has become the standard way to control optical thin film production.

RS: The researchers at Balzers seemed willing to take advantage of the quartz crystal monitoring process for use in their optical coaters. Did you work with them, or did they take the concept and do their own process development?

LL: Balzers worked on quartz crystal monitors long before the establishment of Inficon. As a key supplier to the optical coating industry, Balzers also made its own brand of quartz crystal monitors. Dr. Hans Pulker was a strong advocate for using quartz crystal monitors in optical coating processes at Balzers. We had some exchanges in technical discussions, and he contributed one chapter to a book on the applications of the quartz crystal microbalance that I later edited. He was heavily involved in the development of the Balzers quartz crystal monitors.

RS: Was this a digital approach, also?

LL: Of course, everybody switched to digital instruments. I think Balzers introduced its first digital quartz crystal monitor in 1975. Balzers used a larger crystal, and a slightly different electrode pattern on the crystal. But basically, the same concepts were used, including the adoption of the Z-factor formula.

RS: How did we get—the people who utilize crystals, as we do in my industry, how did we end up using 6 megahertz as the drive frequency for the crystal? Was there some specific driver for this frequency, or—

LL: There are certain traditional criteria that relate the quartz crystal size to its frequency. If you need a very high quality factor for the crystal, then the crystal has to be larger than a certain size. But as crystal technology advanced, these criteria changed as well. The fact is that the size of crystals for thickness monitoring applications has never been optimized. In the very early days, people just used standard crystals designed for frequency control applications, removing the metal enclosure from the crystal unit. The very popular Sloan instruments in the ’60s used a large square crystal, but the crystal holding mechanism generated a lot of instability problems. When Kronos introduced the first digital monitor, a large round crystal was used. The crystal was edge-held in the sensor head, and that significantly reduced the instability problems. Up to then, all commercial crystals were 5 megahertz in frequency and had both faces fully covered with electrodes. I also selected a circular crystal with the edge-held configuration for the sensor head. However, I started with a smaller 6-megahertz crystal. I also designed a special, funny-looking pattern for the back electrode to make the crystal behave in accordance with the mathematical formula. Somehow, this crystal design became, and still is, the industry standard. Why did I choose 6 megahertz instead of 5? It happened that when I first started the development work on a quartz crystal sensor head, I had just gotten a bunch of free 6-megahertz sample crystals from a vendor. They worked out fine, so I stuck with the 6-megahertz crystal. The point is that the crystal frequency, and the size too, has never been carefully optimized. In the analog era, the conventional wisdom was that a higher crystal frequency offered a higher sensitivity. But when digital circuits are used to measure the frequency change of a crystal, the sensitivity is mainly determined by the speed of the internal clock.
So the merits of using a 6-megahertz crystal are not well established yet. Recently, people in the quartz crystal monitor business have mainly spent their effort on improving the electronics and software. No significant progress has been made in the areas of crystal and sensor head design. This means that there may be a lot of room to further improve the performance of quartz crystal monitors. During the same period, major advances have been made in the development of better crystals for frequency control.
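[Editor's note: One way to see Lu's point about the internal clock is to work out the resolution of a reciprocal (period-counting) frequency measurement, where the one-tick uncertainty of the reference clock, not the crystal frequency, sets the floor. This is a back-of-the-envelope sketch under that assumption, not the design of any particular instrument.]

```python
def freq_resolution(f_crystal, f_clock, gate_time):
    """Resolution (Hz) of a reciprocal counter that times crystal
    periods against an internal reference clock for `gate_time`
    seconds.  A +/-1 clock-tick timing uncertainty gives
    df = f_crystal / (gate_time * f_clock)."""
    return f_crystal / (gate_time * f_clock)
```

With, say, a 100 MHz internal clock and a 0.1 s gate, a 6 MHz crystal resolves to about 0.6 Hz and a 5 MHz crystal to about 0.5 Hz; both are set almost entirely by the clock, and a faster clock tightens them proportionally, which is the sense in which the crystal frequency stops being the limiting factor.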

RS: Watches or timing circuits…

LL: Mostly for applications in high-speed timing circuits. However, the chief concern for those applications is quite different. The need in that field is for a crystal that generates a precise frequency and is extremely stable over a long time period. When you are dealing with the quartz crystal microbalance, the crystal does not need to have a highly precise frequency to begin with. In addition, the long-term stability is not that critical, because the frequency always changes during normal operation. But even though the objectives are quite different, I think the quartz crystal monitor can benefit from many new developments in crystal technology for timing applications.

RS: Two recent applications for crystal monitors are not only in vacuum-based systems but in liquid-based systems and various biomedical process controls based on flows and viscosity. Same concept?

LL: The principle of the quartz crystal microbalance in a liquid environment is considerably more complicated than in vacuum. For vacuum applications, all you need is to determine the frequency of the sensing crystal. When the crystal is immersed in a liquid, one also needs to measure the change in crystal impedance to learn about the crystal-liquid interactions. Dr. Kay Kanazawa is one of the pioneers in applying the quartz crystal microbalance to liquid systems. I think he is still very active in this field. Of course, the quartz crystal microbalance can also be used in an environment of any ambient pressure. So it has many applications in the fields of electrochemistry, gas analysis and immunoassay. The research activities in these areas have been very lively in recent years.
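[Editor's note: The best-known result from the liquid-phase work Lu mentions here is the Kanazawa-Gordon relation, which predicts the frequency drop when one crystal face contacts a Newtonian liquid. A minimal sketch, using standard tabulated constants for AT-cut quartz; the function name is ours.]

```python
import math

RHO_Q = 2648.0       # quartz density, kg/m^3
MU_Q = 2.947e10      # shear modulus of AT-cut quartz, Pa

def kanazawa_shift(f0, rho_liq, eta_liq):
    """Frequency shift (Hz, negative) of a crystal with one face in a
    Newtonian liquid of density rho_liq (kg/m^3) and viscosity
    eta_liq (Pa*s) -- the Kanazawa-Gordon relation."""
    return -f0**1.5 * math.sqrt(rho_liq * eta_liq / (math.pi * RHO_Q * MU_Q))
```

For a 5 MHz crystal dipped in water this predicts a shift of roughly -714 Hz, a frequently quoted benchmark in the liquid-QCM literature.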

RS: Eventually you decided to leave Inficon and work on other process-control activities. What caused you to leave Inficon and pursue these other applications?

LL: I developed more than just the quartz crystal monitor at Inficon. Another instrument for deposition process control was based on electron impact emission spectroscopy. As I mentioned earlier, I also used some of the results from this project for my Ph.D. dissertation. The end product was a continuous deposition monitor with material selectivity. I think that product, called Sentinel commercially, has gone through several generations by now. I also initiated the project of making a low-cost RGA with a graphical user interface. No drastically new technology was put into the sensor; we simply took advantage of the rapidly growing capabilities of microprocessors at that time. That product line became a major success for the company. However, after many years of living in upstate New York, I got sick and tired of long winters and snowy weather. As the company grew, I also got disenchanted with the management style of one of the co-founders. So I moved here to Silicon Valley and set up a western operations office for Inficon. Less than a year later, Inficon was bought out by Leybold-Heraeus. No management change was made back in Syracuse, so I made the decision to leave Inficon in 1977.

RS: And stay in sunny California…did you start another company, or…

LL: Yes, after I left Leybold-Inficon I became a consultant for a couple of years. Then I started another company, Xinix, in 1980. That company was merged into Luxtron in 1988. I took a little break and then started Intelligent Sensor Technology (IST) in 1992. That company was bought out by Luxtron again in 2001. At Xinix, we developed a line of optical emission monitors -

RS: Did you find the large volume application you were looking for?

LL: The customers that used the optical emission monitor at that time were mostly in the semiconductor industry. Dry etching had become an important step in the IC manufacturing process. Optical emission spectroscopy was used to detect the etching endpoint as well as for plasma diagnostics.

RS: Another optical process control device is embodied in your ATOMICAS instrument.

LL: Well, ATOMICAS was the commercial name for an instrument that I developed at IST. ATOMICAS stands for atomic absorption spectroscopy. IBM was the first to apply this technique to measuring the deposition rate, way back in the early ’70s. I made a laboratory unit and played with it for a while at that time. The conclusion was that for applications in high vacuum environments, the Sentinel was a more convenient product for the job. It was then put on the shelf, so to speak. Then in the late ’80s, everybody got excited by the discovery of high temperature superconductors and tried all kinds of recipes for making better superconductor films. Because these films consisted of multiple components and their preparation required high reactive gas pressures, the existing deposition monitors were inadequate for controlling the deposition processes. I then realized that the atomic absorption method could be used to solve a number of problems in the preparation of superconductor films. I quickly made a few instruments and worked with a group at Stanford University. They successfully made some high quality superconductor films using a co-evaporation technique controlled by atomic absorption monitors. We jointly published a number of papers. People then recognized the potential applications of the atomic absorption monitor in other co-deposition processes, certain reactive processes, and continuous in-line systems, just to name a few. That was how the interest in the atomic absorption monitor got rejuvenated. I first got a Phase One SBIR contract from DARPA to develop an improved version of the atomic absorption monitor. That ended up as the commercial ATOMICAS instrument. Later we got another sizable contract from DARPA that allowed us to hire a few Ph.D.'s just to explore new applications of ATOMICAS.

RS: So, Larry, atomic absorption spectroscopy is based on an optical beam keeping track of an individual vapor species, the atomic flux, in a vapor deposition process. I know the device is used extensively in many R&D projects, as you have said, but are there specific areas where you are seeing more utilization of this instrument in production now?

LL: It has been used for controlling a variety of physical deposition processes. Its material selectivity has been particularly useful in controlling the film composition during multi-element, co-deposition processes. One example is MBE growth of III-V compounds. It has also found applications in the manufacturing of photovoltaic films using the so-called CIS process. This process requires the deposition rates of the individual elements – copper, indium, and sometimes gallium – to be precisely controlled.

RS: Now with the ATOMICAS, are we moving that back into the realm of real-time process control? As an instrument supplier, is this along the lines of providing process engineers a tool and then they optimize their own proprietary processes using this tool, as opposed to, let’s say, the situation with an endpoint detector? Do you have to really know specific process parameters with some of these sensors?

LL: Absolutely. In the case of process endpoint detection, life is somewhat easier. It is more or less tracking a single set point. However, most of the time the process endpoint is not represented by just a particular signal level from the monitoring sensor. So you have to set certain rules for the behavior of the sensor output to define the endpoint. You must have detailed knowledge of the process to come up with the right algorithms. Otherwise the instrument will miss the endpoint or mistakenly identify it. For real-time control, as the atomic absorption monitor is designed for, things are even more complicated. A control signal is fed to the source to keep the deposition rate constant. To keep the control loop stable, the characteristics of the deposition source and the overall system must be known. To make the film composition consistent, the readings from the monitor must be in absolute values. Very few sensors can give absolute readings without first going through some kind of calibration procedure. ATOMICAS is no exception. So the question of how to calibrate, and how often calibration is needed, becomes an important issue. In order to develop practical calibration procedures for the users to follow, you need knowledge of the process and the deposition system.
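[Editor's note: As a toy illustration of the closed loop Lu describes, here is a minimal proportional-integral controller that turns a monitor's rate reading into a source-power correction. Everything here, the names, gains, and update interval, is hypothetical; as Lu cautions, a real loop must be tuned to the measured dynamics of the source and system.]

```python
class RateController:
    """Hypothetical PI loop holding a deposition rate at a setpoint."""

    def __init__(self, setpoint, kp, ki, dt):
        self.setpoint = setpoint   # target rate, e.g. angstroms/s
        self.kp = kp               # proportional gain
        self.ki = ki               # integral gain
        self.dt = dt               # update interval, s
        self.integral = 0.0

    def update(self, measured_rate):
        """Return a source-power correction for one update interval."""
        error = self.setpoint - measured_rate
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral
```

The integral term is what makes absolute, calibrated readings matter: a systematic offset in the sensor signal is integrated into a steady composition error rather than averaged away.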

RS: Is the location of the sensor in the process space as important in the ATOMICAS as some of the other applications?

LL: Yes. The beauty of the atomic absorption monitor is that you are using an optical beam to probe the vapor flux. So it can be totally non-intrusive. The light source and the detector can be placed outside the chamber. But the problem is, naturally, that the chamber needs a pair of windows to allow the optical beam to pass through. This is a typical interface problem. If a potential user already has a chamber, but without viewports at the right locations, it can be rather difficult to install an ATOMICAS unit in the system. On the other hand, if the system is still in the design stage, it is relatively easy to put the ports in the right places. Of course, the locations of the ports affect the performance of the monitoring system. Placing the probing optical beam close to the source may be preferred in certain applications, as opposed to placing it close to the substrate. But the situation can be reversed in other applications. So the sensor-system interface is a very critical issue in making the best of a process monitoring instrument. Most people find this out too late – there is no place in an existing system where a sensor can go in an optimal position. My advice has always been: think about how the monitoring sensors will be placed at the earliest stage of designing the chamber and internal fixtures.

RS: So, as you’ve seen many of these different processes evolve over time—plasma etch, wet chemistry, deposition—where do you think the future’s going with process control systems? Is it more stability, is it higher rates? What are people going to be looking for?

LL: From the input that I have gotten from a large number of users, the range of deposition rates that need to be controlled varies all over the map. Many people working in the field of MBE call for controlling the rate at the level of hundredths of an angstrom per second. These processes take a few minutes just to grow a monolayer. At the other extreme, in some industrial coating processes the deposition rate can go as high as a few thousand angstroms per second. It is unlikely that any single monitoring instrument can cover such a broad range of deposition rates, and even less likely, for all materials. As for the process ambient, if you want to grow a monolayer slowly, say, over several minutes, an ultra-high-vacuum environment becomes the likely choice. But many processes, such as sputtering and CVD, favored by industry for their high throughput, operate in a low vacuum environment. Again, many monitoring techniques can only be used in certain environments. Any sensor that uses a thermionic emitter does not work well, or last long, in an environment with a high partial pressure of reactive gases. One nice thing about atomic absorption spectroscopy is that it is applicable in an environment of any pressure. As for the trend of new process development, I see the semiconductor manufacturing related processes moving away from the ultra-high-vacuum environment. However, regardless of the ambient pressure involved in the process, cleanliness will always be the most critical requirement for any system.

RS: Do you think that the availability of high-performance and cheap computer power gives any advantage to some of these process-control techniques?

LL: Oh, yes. The power of computers is definitely changing the nature of instrumentation. Before, we typically had a box with all these knobs and displays on its front. Inside the box, the electronics were always dedicated to the sensor. Nowadays, an instrument can be just a plug-in board for a PC. In my view, the most critical part of a process monitoring instrument is the sensor. It must be robust and able to survive in a hostile process environment. It must measure the process parameter that you are interested in, and do it accurately and reproducibly. If you have a clean signal coming out of the sensor, any low-cost computer system available today can do a good job of converting the signal into a useful format. Otherwise, it’s garbage in and garbage out. The computer simply cannot be used to fix all the imperfections in the sensor signal. So after the sensor, the other critical part of an instrument is basically a black box that contains the sensor interface and drive electronics. All signal processing is done either by an embedded or a bus-connected computer. A graphical user interface is used to indicate the status of the instrument, or of the whole process system. If the physical principle of the sensor is sound and the sensor is well constructed for a long operating life, it will be possible to continuously improve the instrument's capability by upgrading the software only. I believe this will slow the obsolescence of many instruments.

RS: Do you think there will be a drive to develop hybrid or combined sensors, where you utilize multiple sensors – let’s say, an optical emission monitor along with a quartz crystal monitor – to provide robust knowledge of a species and a mass deposition rate?

LL: Yes, I definitely think so. In order to control a process, you cannot just monitor one parameter. Again, let’s take the example of a multi-layer optical coating process. You need to control the optical thickness as well as the index of refraction of each layer. The index of refraction is difficult to determine and control in real time, so you control it indirectly by keeping the deposition rate and some other process parameters constant. An optical monitor is fine for giving you the optical thickness, but it is not well suited to controlling the deposition rate. So you use a quartz crystal monitor to control the rate. This type of dual-monitor system has been well accepted in the optical coating industry. In theory, one could even use the quartz crystal monitor alone to control both parameters. However, in order to keep a direct relationship between the physical thickness and the optical thickness of a film, some other process parameters would need to be controlled more tightly. In any case, what you need is a process control system with multiple inputs. Many people overlook the fact that the ionization gauge for total pressure indication is a very important input in the overall process control system. Most operators rarely question the long-term accuracy of such gauges while trying hard to get other process monitors precisely calibrated. Obviously, each type of process monitor has its own limitations. If one puts two different but complementary monitors together, sometimes the combined instrument can offer many new capabilities. For instance, the atomic absorption monitor is great for long-term operation, and it is material selective. But the signal coming out of an atomic absorption monitor depends on many external factors and must be calibrated periodically. For the quartz crystal monitor, the readings are in absolute values. No calibration is required.
But the sensing crystal of a quartz crystal monitor has a limited life, and it cannot distinguish what kind of material is being deposited onto it. So if you put the two together, you can use a shuttered quartz crystal for occasional calibration purposes only and let the atomic absorption monitor do the rate control continuously. Now you have an accurate and continuously running deposition monitor suitable for many in-line systems. Of course, there may be some technical problems involved in making multi-sensor monitors, but I don’t think those are insurmountable. With the “black box” concept for the front end that I envisioned, the cost of a multi-sensor deposition process controller can be brought down significantly. The progress in this direction, as I see it, is mainly limited by the availability of useful sensors. In comparison to the recent advances in electronics and software, sensor development has been moving at an alarmingly slow pace.
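[Editor's note: The pairing Lu describes reduces to a few lines of logic: the atomic absorption channel runs continuously, and a brief shuttered quartz crystal reading periodically re-anchors its gain. This is an illustrative sketch only; the class and method names are ours, not any product's API.]

```python
class HybridRateMonitor:
    """Sketch of the dual-sensor scheme: a continuously running
    atomic absorption (AA) channel whose gain is periodically
    re-anchored to an absolute, shuttered quartz crystal reading."""

    def __init__(self, gain):
        self.gain = gain              # rate per unit AA absorbance signal

    def rate(self, aa_signal):
        """Continuous, non-consuming rate estimate from the AA channel."""
        return self.gain * aa_signal

    def recalibrate(self, aa_signal, qcm_rate):
        """Open the crystal shutter briefly; the QCM's absolute rate
        resets the AA gain, then the shutter closes to save crystal life."""
        self.gain = qcm_rate / aa_signal
```

The crystal is consumed only during the short calibration windows, so its limited life no longer limits the run length of the in-line system.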

RS: Larry, you also have worked on some optical monitors that monitor the optical reflection of a signal off a substrate. Were those just individual instruments, or did you actually think of product lines around some of these monitor systems?

LL: I did develop quite a few optical monitors of different kinds, but mostly for special requirements from customers. The physical principles involved were generally simple. The problems were always the interface between the sensor and the deposition system. Optical methods are nice if the optical paths are short and nothing moves around. Obviously, these conditions do not exist in a typical deposition system. One of the jobs that I got many years ago was to monitor the transmission of a film in a very large roll-coater at discrete wavelengths from the near UV to the near IR. It sounded like a simple task, but there was simply no off-the-shelf instrument, or even building blocks, available at that time. I had to test and select most of the key components first, and then put all these things together to come up with a solution that satisfied the customer’s requirements. Another project involved the in-situ measurement of the spectral reflectance of optical disks. Optically speaking, it appeared rather easy. All you needed was a broadband light source and a spectrometer. The problem was that the disks were moving rapidly and wobbling in the in-line system. Again, a significant amount of effort was spent on solving the sensor-to-system interface problems. In both cases, the instruments so developed did not appear to have a large market, because they were designed for specially configured systems making special products. The volume was not there.

RS: Larry, one of the other drivers that I’ve seen in the industry is not only upgrading older process chambers, but that the customers are looking for a turnkey operation, and therefore you have the vapor source and process control integrated directly into the system. Does that mean that it would be more difficult for an entrepreneur to follow the type of journey you’ve been on, where you supply a lot of process control tools which have become widely accepted across many different industries? That is, is it necessary these days to develop a company that supplies a whole system, or do you think it is still possible to develop stand-alone sensors to be sold into various production lines?

LL: I think in the field of in-situ monitors, it will be more and more difficult to develop a stand-alone instrument and be successful, as was the case with the quartz crystal monitor. With processes and systems getting more sophisticated with every passing day, the interdependence between these two and the process monitors gets tighter and tighter. The in-situ monitor can no longer be considered an optional add-on, but only a must-have, built-in component of the system. Another problem is that, regardless of how much improvement a good process monitoring system can add to product yield and throughput, most equipment manufacturers are only willing to allocate a small fraction of the total system cost to the controlling instruments. But to successfully develop and support an in-situ monitor, a significant amount of resources needs to be invested first to gain knowledge of these specialized processes and equipment, and then to keep up with the pace of rapid technical development in these areas. A small company may not be able to find the talent or afford a strong product support staff. So, even with demands for in-situ process controllers everywhere, the growth potential for an independent company that makes only such instruments is not going to be huge. In this business, it’s unlikely that you can sell thousands of instruments right off the shelf. Many users may have unique requirements and need special solutions from the instrument supplier. For me personally, I always enjoyed solving problems for the customers, and especially seeing some of my ideas developed into useful hardware. In the early days when I was developing the quartz crystal monitor, I could make some changes to the crystal or sensor head and then get the results almost immediately by running a deposition process myself.
With the right equipment available, direct hands-on involvement and a good understanding of both process and sensor physics made the product development time very short. That was particularly satisfying to me. Unfortunately, as equipment and processes get more complex and expensive, it is no longer easy to find that kind of opportunity. When I developed the endpoint detector for dry etching processes at Xinix, I never had a plasma etching system in our laboratory.

RS: How many quartz crystal monitors do you think there are deployed in the world at this point?

LL: I wish I could answer that question. Because most of the companies are private and don’t give away any market information, I don’t think anyone has an exact number. The lifetimes of these instruments also seem pretty long. I still see some twenty- to thirty-year-old quartz crystal monitors being used today. It is also like the razor blade business: a few companies are doing very well by just making and selling the replacement crystals. So from these indications, I think there must be a lot of quartz crystal monitors in use all over the world now, probably in the thousands. An interesting note is that in the mid-70s, I got a marketing report predicting that the market for quartz crystal monitors would disappear within a few years.

RS: And yet thirty years later and they’re still selling a lot of crystals into the industry.

LL: The reason is that the quartz crystal monitor has certain basic advantages. The most important feature is that its measurement is absolute. Another is that quartz crystal sensors do not have serious system interface problems. All you need is a hole for a feedthrough to get the sensor installed.
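The absolute character of the measurement comes from the Sauerbrey relation, which ties the crystal’s frequency shift to the deposited areal mass through quartz material constants alone, with no empirical calibration against the film. A minimal Python sketch, assuming an AT-cut crystal in the thin-film limit (real instruments add tooling-factor and acoustic-impedance, or Z-match, corrections that are omitted here):

```python
# Sauerbrey relation: for a thin film, the frequency decrease of an
# AT-cut quartz crystal is proportional to the deposited areal mass.
RHO_Q = 2.648      # density of quartz, g/cm^3
MU_Q = 2.947e11    # shear modulus of AT-cut quartz, g/(cm*s^2)

def film_thickness_nm(f0_hz, delta_f_hz, film_density_g_cm3):
    """Film thickness in nm from a frequency decrease delta_f (Hz > 0)."""
    # Areal mass density (g/cm^2): dm/A = sqrt(rho_q * mu_q) * df / (2 f0^2)
    areal_mass = (RHO_Q * MU_Q) ** 0.5 * delta_f_hz / (2.0 * f0_hz ** 2)
    thickness_cm = areal_mass / film_density_g_cm3
    return thickness_cm * 1e7  # cm -> nm

# A 100 Hz shift on a 6 MHz crystal coated with aluminum (2.70 g/cm^3)
# corresponds to roughly 4.5 nm of film.
```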

RS: Well, Larry I thank you for the time you’ve spent. Are there any words of wisdom you want to pass onto some of the other researchers out there and practitioners in the vacuum coating industry about instrumentation?

LL: Well, let’s put it this way. I often told people, perhaps half jokingly, that if I could choose my career path again, I would probably choose the development of ex-situ measurement instruments instead. For ex-situ measurements, you can place the sample in a controlled environment, manipulate it, and take a sufficiently long time to get the data. The sample is totally under your control, and the measurement will be highly reliable as long as you use the instrument properly. With sophisticated technology and added bells and whistles, the user is willing to pay a premium for it. In contrast, an in-situ sensor is typically located in the process chamber of a process whose details you don’t necessarily know in full. The environment is typically very hostile: high temperatures, electrical discharges, mechanical vibration, and things like that. This process environment cannot be disturbed by the sensor to any significant degree. In this respect, you are not in the driver’s seat. The requirement for real-time feedback control means that the data acquisition and processing time has to be relatively short. For the sake of robustness and ease of maintenance, the sensor has to be relatively simple, which rules out many sophisticated technologies. These are the basic challenges that one faces in the development of a new process monitor. So from the technical point of view, there are many opportunities in the in-situ monitoring area waiting for you to explore. However, to create a successful business of in-situ monitoring instruments, one will face many challenges with no easy solutions.

RS: Where do you think the push for nanotechnologies will drive the sensor market? How will we build from the atomic level?

LL: Well, I think every process needs some kind of in-situ monitoring for feedback control, particularly in automated manufacturing processes. In processes that involve vapor deposition, one needs to monitor and control the vapor phase parameters, and preferably also the solid phase parameters of the end product. However, some recent nano-techniques pick up atoms one by one and move each from one place to another. In that case, there is really no vapor phase material at all, but you will need some kind of visualization method to follow the atom movement. This is still in-situ monitoring to allow feedback control. Nanotechnology is an exciting field now, and I think it will provide additional opportunities for in-situ monitoring instruments.

RS: Well, on behalf of the many process control engineers and scientists, thank you once again for all your work in helping gain control of our various vacuum processes.

 

 



© Copyright 2006-2016, Society of Vacuum Coaters (SVC™)
All Rights Reserved
