Dr. Andrew Kahng: Pursuing excellence in DFM ...
********************************
Editor's Note: Andrew Kahng is Founder and CTO of DFM start-up Blaze DFM. He is on leave from his faculty position at UCSD. An edited version of this article was published in March 2004 in in Play in EDA.
********************************

If you have ever been to the campus of the University of California at San Diego, you know that the place is one of great serenity – poised on towering bluffs overlooking one of the most beautiful vistas anywhere along the endless coastline of the mighty Pacific. UCSD has an ambience all its own. This is a place where the ocean breezes meet the wistful beauty of the Torrey pines on the rolling hills above jewel-like La Jolla. It is a place where students come to enjoy the most sincere of college experiences amidst compelling architecture and intimate college clusters, to be inspired by dedicated faculty and some of the most sophisticated research opportunities in the world.

UC San Diego has been growing quietly and steadily since its inauguration in 1965, and has accumulated accolades and honors along the way for the quality, depth, and breadth of its programs across all disciplines. It is a place that does not focus on competition with its sister campuses in Los Angeles or Berkeley – or any other campus of the huge and diverse University of California system. Rather, there is a quiet feeling of unbounded promise here, a sense of destiny as one of the great research institutions in the world.

UCSD is also a place that now boasts one of the true stars in EDA, Dr. Andrew Kahng. Kahng is an incredible individual to interview. He pauses before answering any question, taking his time to respond. It is patently clear that when he does respond, his answer has been placed within a larger context – one that is orderly and rational. Despite his many honors and accomplishments, Kahng maintains an air of dignified humility which makes one take all the more notice of the thoughtful responses he provides to questions laid out before him.

Andrew Kahng spent nearly 12 years teaching at UCLA before returning to San Diego, California, in 2001. Kahng was born in San Diego, did his graduate work at UCSD, and is very happy to be back home, teaching and leading research in the areas where he sees opportunities to make a contribution. Along with his academic accomplishments, and they are many, Kahng has spent many years lending his energy and ideas to making the annual Design Automation Conference a success. This year, he is co-chair of the 73-person technical committee, responsible for the Design Tools portion of the program. Limor Fix from Intel is the co-chair in charge of Design Methodology aspects of the program.

It was an honor to have an opportunity to speak at length recently with Professor Kahng.

Q: What is the outlook for DAC this year?

This year, DAC had a record number of technical paper submissions – nearly 800 – across the three major areas of design tools, design methodology, and embedded systems. Since the number of sessions has remained constant, selection was very rigorous and the technical program is extremely strong as a result. We saw obvious growth in the interest level for such areas as low power – especially leakage – analog and mixed-signal design, reconfigurable devices, and the area of most notable growth, Design for Manufacturing.
We have also put together a number of special sessions and panels that offer exceptional value and interest to the conference attendees, with topics including reliability, platform-based SoCs, performance closure, and the future of ASICs. It is really a stellar program.

Q: How did you get involved with DAC in the first place?

I have been on the DAC committee since 1995. I’m not really sure how this came about, but someone once said that in the early 1990s I had a reputation for extreme diligence as a reviewer, and the DAC committee noticed that here was a person who sent in a 3-to-5-page review for each of the 10 papers I examined, year after year. As a result, I was asked to join the subcommittee for physical design, and I worked on this for 6 or 7 years, along with other topic areas such as fundamental algorithms, platforms, manufacturability and yield. I also served on the Panels subcommittee for a number of years. Eventually, I timed out on these responsibilities, but the DAC Executive Committee asked me to return as co-chair of the Technical Committee this year.

Q: What about the persistent complaint that favoritism or undue influence causes some papers to be selected over others?

DAC is very, very conscious of perceptions in the community it serves. The conference goes to great lengths to avoid any fact or perception of bias. DAC is the only conference that maintains a double layer of anonymity, with anonymous submissions and anonymous external reviewers. All conflicts of interest are flagged during the review process, and committee members are not allowed any influence in any situation where they might have a conflict – they cannot even see the reviews. Having said this, please understand that there are over 80 of the world’s most active EDA technologists on the Technical and Panels committees who are selected for their vision and judgment, and a limited program of 56 sessions, 163 papers, and 8 panels. So, there will always be some complaints of the type you mention.

More generally, every aspect of the program is reviewed for balance of interests and commitments. Look at the composition of the panels over the years – organizers versus panelists versus moderators, big companies versus small companies, customers versus vendors. Look at the care with which overt marketing or log-rolling is prevented through review of slide content, logo size, business relationships, and so on. There are many rules and processes in place to ensure that irregularities are spotted. Of course, there is a tendency to invite customers, investors, or friends of friends into panels or special sessions, but I cannot recall any instance where DAC was unaware of this happening. Negatives, such as perception, are consciously weighed against positives, such as the value that a participant brings to a session. And it is helpful to realize that there is a trade-off between eliciting strong contribution and participation from companies, and restricting their ability to influence outcomes. On balance, DAC tries very hard to distribute participation equitably among the major players and constituencies in the EDA community. When there are complaints, they are always looked at extremely seriously, and practices are amended to avoid any recurrence.

Q: Tell me about your background.

I received my Ph.D. in 1989 at San Diego – I’m a native San Diegan – then joined the UCLA faculty.
I enjoyed my career there, became established in CAD and physical design, advanced through the faculty ranks – but simply couldn’t turn down the opportunity to join UCSD, where I have a joint appointment in both the CSE and ECE Departments.

Q: Hold on. Why are UCSD’s departments organized this way, with the "C" and "E" common to both?

Traditional electrical engineering spans everything from circuits and VLSI to communication theory to applied physics. And while it is a younger field, computer science has rapidly broadened to include such areas as human-computer interaction, machine vision, bioinformatics, and information technology. I think any "EECS" department will inevitably establish some type of hierarchy, if only to handle issues such as recruitment and promotion fairly. What I mean by this is that someone with an applied physics background, say, could be unwilling or simply unable to evaluate another person who specializes in artificial intelligence – whether for recruitment or promotion. So, in the late 1980s, San Diego split its EECS Department into separate EE and CS entities. By contrast, Berkeley has a single EECS Department, but the EE and CS divisions keep independent counsel on key matters. Alternatively, Michigan split the two departments, and then re-merged them into a single department.

But your question was specifically about the "C" and the "E". With semiconductors, design, heterogeneous system integration, and embeddedness at the forefront of value creation and social change, there is a need for integrated skills across embedded application software, middleware and architecture, networking, VLSI design, and system implementation. These are all aspects of the computer engineering field, and so the computer engineer spans both traditional EE and traditional CS disciplines. To address this need, departments can merge, or they can leave computer engineering entirely in EE or in CS – usually the EE side. Alternatively, CE can be shared between the two departments – a choice that increases flexibility as well as potential friction. Each model has worked well; it all depends on the collegiality and the quality of the faculty who commit to making it work.

Q: Are you happy with the move to UCSD?

My family and I were delighted to return to San Diego in January 2001. This is a great city for raising a family – we have very young children – but the overriding factor was UCSD’s uniquely strong upward trajectory. There is a confluence of many positive factors, and one might say the phrase "perfect storm" applies here. UCSD’s future leverages the population growth in California, which means increased enrollment and faculty size at those UC campuses that have room to expand. UCSD is the single dominating research university, and enjoys tremendous community support, in this, the seventh-largest metropolitan area in the United States. There are one-of-a-kind centers of technical excellence in the region: the Salk Institute for molecular genetics research, a strong biotech industry, the Scripps Institution of Oceanography, the incredible hotbed of wireless telecom that grew out of Linkabit and Qualcomm. Nokia, Sony, STMicroelectronics, Intel, IBM, etc. all have a local R&D presence. Broadcom and Conexant are just an hour’s drive away. All in all, I feel that UCSD is in an excellent position with respect to recruiting outstanding faculty and outstanding graduate students. A lot of people want to come here, and personally I’m grateful to be here.
I feel as if I’ve gotten in on the ground floor of something truly wonderful.

Q: Tell me about your research focus.

I come from a background of physical design, along with performance analysis, physical verification, and chip implementation methodology. I’ve founded or chaired such conferences as ISPD, SLIP, EDP – and was a visiting scientist at Cadence from 1995 to 1997. I’ve also been part of the ITRS roadmap effort since 1996.

These days, my research group focuses on the interface between design and manufacturing. For the past five years, in my role as chair of the ITRS roadmap committee for design technology, I’ve been keying on the message that "cost of design is the greatest threat to the future of the semiconductor roadmap." I’ve had a long-standing concern about the level of investment in design technology vis-a-vis solving the "red brick" challenges in the ITRS roadmap. "Red bricks" are the technology requirements that are needed to keep Moore’s Law going, but that have no known solution. Examples include reducing interconnect capacitance and noise, controlling device leakage, and reducing manufacturing variability. Today, such challenges are thrown onto the process, materials, or equipment engineer, and billions of dollars are spent trying to advance low-k materials or copper interconnects, next-generation lithography, et cetera.

Ultimately, it is critical for the semiconductor industry to understand that "equivalent scaling" advances can be obtained by investing in improved design technology. The value proposition of EDA is that we can help keep the semiconductor industry on the Moore’s-Law cost trajectory, through pure software solutions. Statistical timing and performance optimization to directly improve parametric yield and cut away corner-based guardbanding; layout that is amenable to minimum-cost reticle enhancement; reticle enhancement that is amenable to fracturing and mask writing or inspection – there is stunning potential in this space when you look at the whole list of possibilities. Efforts in these areas could save millions of dollars in design and manufacturing NRE, and close a lot of open loops in today’s design-through-mask methodology. So, design for manufacturing is of intense interest to me and the primary focus of our research group.

Another way of describing what we do is: our goal is to maximize the dollars that end users can extract from each wafer. We hope this research can make the entire industry aware of new ways to reduce cost and risk in semiconductor product development. Otherwise, we’re in a death spiral.

Q: How long have you been working on DFM?

As Program Chair of the Sixth Physical Design Workshop in 1996, I put together an invited session on DFM. For the first International Symposium on Physical Design in 1997, we invited Dr. P. K. Vasudev of SEMATECH, who was literally shouting into the EDA wilderness about strange things like OPC and phase-shifting masks. I took some trips to learn more from P.K., and from Dr. Frank Schellenberg, who’s now at Mentor Graphics. So, in 1997 my group made the jump, and by 1998 we were publishing on topics such as dummy fill and phase-shifting mask layout.

I’ve also participated in some efforts to build bridges across multiple communities – EDA, designer, mask, and process – since the ability to even converse across disciplines has always been the key to solving DFM. There are a lot of valuable cross-industry forums such as Advanced Reticle Symposium, X Initiative, Open-Kit, and SPIE/BACUS meetings.
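Dummy fill, which Kahng mentions above, adds non-functional metal so that every window of a layout meets a minimum-density rule for CMP uniformity. The following is a minimal Python sketch of that idea only; the window grid, the density target, and the numbers are invented for illustration, and this is not the UCSD group's actual algorithm, which works on real geometry and optimizes objectives such as density variation and added capacitance.

    # Minimal illustration of minimum-density dummy fill on a fixed grid of windows.
    # The density target and the per-window feature densities are invented numbers.

    MIN_DENSITY = 0.30   # hypothetical minimum metal density required per window

    # Fraction of each window's area already covered by functional metal.
    layout = [
        [0.10, 0.45, 0.70],
        [0.05, 0.25, 0.60],
        [0.55, 0.15, 0.35],
    ]

    def fill_needed(density, target=MIN_DENSITY):
        """Extra dummy-metal density needed to bring one window up to the minimum."""
        return max(0.0, target - density)

    filled = [[round(d + fill_needed(d), 2) for d in row] for row in layout]

    for before, after in zip(layout, filled):
        print(before, "->", after)

Windows already above the target are left untouched; real flows typically also enforce a maximum density and limit density gradients between adjacent windows.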
To complement these forums, I helped organize special sessions on subwavelength lithography and RET at both ISPD and DAC in 1999, and again in 2001. And I was part of the committee that started the Design-Process Integration conference at SPIE. On my webpage, you can find tutorial presentations from DAC, ISPD, ICCAD, ISQED, ASPDAC – and some other invited talks. I’m always excited about opportunities to evangelize this technology direction. It’s been a 7-year effort so far for our group.

Q: In your answer a few minutes ago, what did you mean by "death spiral"?

Well, this is the "Dark Future" I’ve been describing in talks for about the past five years. As design cost, turnaround time, risk, etc. continue to grow, we see more workarounds such as platform SoCs, reprogrammability, or even pure software-based value-add and product differentiation. A well-discussed symptom is the declining number of ASIC starts and tapeouts. This isn’t working around the "design productivity gap" or some mythical failure of EDA. In the limit, it’s working around the increasing cost of retooling – and amortization of that retooling. There’s a vicious cycle of poor value delivered, poor business valuation and ROI, and poor levels of R&D investment. I’d like to see a virtuous cycle of improved value to the customer, improved valuation, and greater R&D investment. I think DFM is both a lifeline and a lifetime opportunity for the EDA suppliers to show their value to the semiconductor sector.

Q: Speaking of bridges between communities, how do you think the Numerical Technologies purchase by Synopsys is working out?

Hmm. I was on Numeritech’s technical advisory board – and Cadabra’s before that. Numeritech had a very strong collection of technical components and intellectual property – they had Cadabra’s library synthesis capability, and basic patents in alternating PSM. The issue is that it’s always difficult for lithography and optical modeling and simulation, and printability analyses, to make a bridge back up to design creation and optimization. AIM tools by themselves don’t tell designers how to close timing or improve parametric yield, how to architect a better cell library, or which design rules can be pushed. The SPIE crowd, coming from lithography, sometimes doesn’t understand how to address chip-level metrics, or develop glue infrastructure such as parasitic extraction and noise analysis. So, a merger of Numeritech with some EDA giant seemed inevitable. I remember thinking at the time that there was a good match with the technology management style at Synopsys.

Another point to consider – given Numeritech’s focus on phase shifting – is whether there are cost-competitive alternatives to strong phase shifting. We’re seeing off-axis illumination, CPL, and a number of other possibilities that could yet again slow adoption of PSM, especially any flavor that requires doubling of mask steps.

A key development was the announcement last year, around the DAC-2003 time frame, that Intel is skipping 157-nanometer exposure tools and pushing all the way down to the 45-nanometer node using 193-nanometer immersion lithography. As far as I’m concerned, that was the biggest news in EDA for all of 2003. It was an incredible boost for the importance of EDA to the semiconductor industry, and in my view it increased the potential value of phase-shifting. Going the other way, some IDMs have already said that 65 nanometers will have strongly restricted design rules, trading away layout designers’ freedom to regain process window and parametric yield.
This may reduce the need for strong phase-shifting. I agree that solutions to the "million-dollar mask set" will have to include design rule restriction, and so we need to develop and present to designers a rational and principled way to assess the tradeoffs between design freedom and manufacturability. You could think of structured ASICs as a manifestation of the same trend, because with those types of devices, the critical layers have already been printed or are known to be printable.

Q: What do you think about structured ASICs?

There’s certainly a lot of buzz these days, because with so many flavors, they are offering more distinct tradeoff points along some Pareto surface of NRE, turnaround time, and power-speed-density. We’ve seen the ASICs-versus-FPGAs controversy at DAC for many years, and this has now spread to VPGAs and reconfigurable SoCs. Certainly there’s a space where structured ASICs are extremely competitive, but when you push along any one of the volume, performance, or integration axes – really push to the edge of the technology – you’ll see that the design can’t afford the tradeoffs associated with executing in an FPGA or a structured ASIC. You’ll have to go with traditional solutions – and that’s where my group’s research comes in. We’re pushing down the cost side of traditional implementation, trying to stop the scaling of design and manufacturing NREs. For continued vitality of the industry, and continued access of new applications to silicon implementation, this is critical.

Q: Would you say your group’s focus on cost will lead to new tools, or to new methodology?

That’s a great question. The answer is, "Both." We’re doing point tools research to address cost reduction on every level we can think of – RET insertion, leakage power variability, guardbanding methodology, characterization of variability, layout fracturing for VSB mask write, multi-project reticle and wafer design, wafer shot map optimization, dicing plan optimization – you name it, we’re trying to look at it. This list might sound long, and my research group is not large, but I have some great Ph.D. students and some great working collaborations, notably with Dennis Sylvester and Igor Markov at Michigan, and with the MARCO and SRC member companies that support or mentor our research. And there is great and fast-moving research at other locations, on leakage and thermal issues, advancements in library modeling, statistical analysis methods, and so forth. All of these EDA tool capabilities will be part of solving the ITRS red brick challenges.

But we’re also concerned with methodology, which buys a lot, and should be much more highly valued. If I can reduce a twelve-month design turnaround time by 40 percent, I’ve saved 20 weeks, which is the equivalent of 20 percent of a technology node. Moore’s Law is roughly "one percent per week" – and my view is that we need to take those percents any way we can, whether in days, or megahertz, or milliwatts. Dr. Stefanus Mantik from my group developed an initiative and an entire open-source infrastructure for measuring and improving the design process – the "METRICS" initiative, which we started back in 1998. Right now, a couple of start-ups are trying to take this type of idea into the marketplace. By putting a layer of optimization on top of existing commodity tools in the RTL-to-GDSII implementation space, we can maximize ROI on a customer’s EDA investment by using tools as optimally as possible.
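To make the idea of an optimization layer over existing tools concrete, here is a minimal Python sketch of METRICS-style data collection around a parameter sweep. It is illustrative only: the parameter names, the quality-of-results (QoR) numbers, and the stubbed run_trial() function are invented, and a real flow would replace the stub with an actual tool invocation and report parsing.

    # Minimal sketch: sweep implementation-tool parameters, record QoR per trial.
    # run_trial() is a stub with invented numbers, standing in for a real tool run.

    import csv
    import itertools

    PARAM_GRID = {
        "effort": ["medium", "high"],
        "target_utilization": [0.60, 0.70, 0.80],
    }

    def run_trial(effort, target_utilization):
        """Stub for one implementation run; returns invented QoR metrics."""
        # Invented relationship, purely for illustration: higher effort and lower
        # utilization improve worst negative slack, at a cost in area and runtime.
        effort_bonus = 50.0 if effort == "high" else 0.0
        wns_ps = -120.0 + effort_bonus + 100.0 * (0.80 - target_utilization)
        area_um2 = 1.0e6 / target_utilization
        runtime_s = 3600.0 * (2.0 if effort == "high" else 1.0)
        return {"wns_ps": wns_ps, "area_um2": area_um2, "runtime_s": runtime_s}

    def sweep(out_csv="qor_sweep.csv"):
        """Enumerate the parameter grid and log one row of QoR per trial."""
        keys = list(PARAM_GRID)
        metrics = ["wns_ps", "area_um2", "runtime_s"]
        with open(out_csv, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(keys + metrics)
            for values in itertools.product(*(PARAM_GRID[k] for k in keys)):
                qor = run_trial(**dict(zip(keys, values)))
                writer.writerow(list(values) + [qor[m] for m in metrics])

    if __name__ == "__main__":
        sweep()

Accumulated over many runs and designs, this kind of record is what exposes the QoR sweet spots and parameter sensitivities Kahng describes next.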
You need understanding of the QOR sweet spots, understanding of distributed compute platforms, understanding of enterprise-wide process management, understanding of tools and QOR sensitivity to design parameters – and then you can start to pull all that’s possible out of existing technology.

Q: DFM has not exactly been a barn-burning sector in the EDA industry. Do you see a light at the end of the DFM tunnel?

The truth is that I’ve been giving my vision talk, in one form or another, since 1999. I usually present a To Do list at the end of the talk, and explain that many of the items aren’t just low-hanging fruit, they’re fruit that’s actually sitting on the ground waiting to be picked up and sold. So yes, I am worried about who is actually going to do this right. The EDA industry structure is harmful. Internal R&D structures, which are usually along classic partitions of physical verification versus custom layout versus DSM analysis versus batch P&R, are also harmful. It’s not obvious that this context will permit the best solutions to be developed as efficiently as possible. Evolution of existing tool offerings and mindsets may take more time than we have. Having said that, I’m an optimist at heart. Our work here in San Diego is deeply engaged along all of the trajectories I’ve mentioned, plus quite a few others I haven’t mentioned. Very bright young graduate students have come to UCSD to work in the DFM area, and I’m confident that we’ll contribute to future DFM solutions.

Q: Where do your students come from?

That’s another interesting question. My last few Ph.D. graduates and my current Ph.D. students come from India, China, Korea, Ukraine, Egypt and Indonesia. These students are uniformly the best of the best – extremely well-trained and talented. EDA is a concern to industry and academia all over the world, and on one hand we do good EDA research in the U.S.; on the other hand, these students typically have their pick among the top programs. I should say that there are two shadows in this picture, one transient, and the other structural. The first shadow, which has been reported in the New York Times and other media, is that because of the current visa climate, applications from abroad declined dramatically this year. And current students can’t leave the country to give talks or to visit their families, because they have so much trouble returning to the U.S. The second issue is really just a fact of life, which is that U.S. universities have been cycling through overseas sources of brains, and we’re going to eventually run out of these sources.

Q: Can you explain that?

Well, twenty years ago, engineering schools in the U.S. had almost all the top graduates from Taiwan, for example. Then India opened up: nearly entire batches of IIT graduates in computer science would come to the States. And then the People’s Republic of China. More recently, it’s been the former Soviet republics – Ukraine, Russia, Belarus. And, too, Romania and Yugoslavia. These waves begin with a trickle, perhaps after normalization of international relations, and then increase as a network of former schoolmates and alumni is formed. But these waves also end when enough well-trained students go back home and bring their expertise – along with any prestige that may come with their American degrees – to their domestic universities.
After enough MIT or Stanford or Berkeley Ph.D.s become professors in their home countries, programs that already enjoyed top-notch technical excellence can definitely retain the best and brightest students in that country. And that’s the end of the wave: fewer students opt to come to the U.S. for their graduate work. My point is just that we’re running out of waves. As a young faculty member at UCLA, I saw the end of the Taiwan wave. I never saw any significant number of students from Japan, Germany, France, or the U.K., because the top universities in those countries are better for purposes of career advancement for those students. I sense we’re now at or past the peak of the China wave, and perhaps the India wave as well. Currently, we’re riding the Eastern Bloc wave.

Q: Last question: What would you tell a graduate student who is considering San Diego?

I’d say that we are a young university – my departments are only 17 years old – and our faculty is young and energetic. I would point to the strong entrepreneurial spirit and energy in our faculty, the excellent balance between senior and junior faculty, and the recent recruiting wins and rapidly improving pedigree of the engineering school and its departments. I’d mention the future growth of the university and its engineering school, by dint of the university’s mandate to serve the population of California as it grows. I’d mention the incredible quality of life here in San Diego. And then I’d say that while the future is bright even according to these considerations – and, really, I’ve only listed a few raw metrics: papers, dollars, acres, temperatures – there is so much more at the heart of UCSD. Namely, there is a balance here between education, research, and a heartfelt commitment to serve at all levels. I find a compelling sense of humanity that pervades this university and its core principles – a deep sense of social responsibility to community, nation, and mankind. This is what makes UCSD unique. As I mentioned at the outset, I feel very lucky to be part of this institution.

Email: abk@ucsd.edu
WWW: http://vlsicad.ucsd.edu/~abk/

********************************
Peggy Aycinena owns and operates EDA Confidential. She can be reached at peggy@aycinena.com