What you’ll learn:
- An introduction to Avishtech and its holistic approach to PCB design and simulation.
- How minute changes to PCB design at high frequencies and data rates can lead to re-spins.
- How simulation that accounts for real-world materials properties can accurately predict ground-plane losses.
Today’s high-frequency, high-data-rate products (112 Gb/s/channel and 5G/mmWave applications) deliver mountains of data in a hurry. That speed strains traditional PCB design approaches: boards meant for these kinds of products afford very little “wiggle room,” and the slightest misstep in the PCB design can result in a product that’s unmanufacturable, inoperable, or both. And while traditional EDA toolsets provide insight into the specifics of designing a working PCB, they offer little visibility into the critical manufacturing information that can make or break a PCB design.
I had the chance to sit down with Keshav Amla, founder and CEO of Avishtech, an EDA startup that has developed two new EDA toolsets—a stackup tool, Gauss Stack, and a 2D field solver, Gauss 2D, both of which are meant to build a bridge between EDA, manufacturing processes, and end-product reliability.
How did Avishtech come to be?
My experience, interests, and education lie in mechanical engineering and materials science and engineering. While I was pursuing my undergrad at Caltech, I planned to start a company.
Avishtech began while I was a grad student at Stanford. Our co-founder and CTO is my father, Dr. Tarun Amla, whose primary educational background is in mechanical engineering, modeling and simulation, and electrical engineering. More critically, he’s a leading expert in the electronic materials sector and has been central to many of the key materials advances in the industry for a long time.
For many years, he’s had concerns regarding the gap that exists between the PCB design/development process and subsequent manufacturing processes. As these high-frequency/high-data-rate products evolved, that gap reached a critical point at which the slightest change in the design stage could result in a failed product at the manufacturing, test, or field operations stages.
We envisioned significant enhancements to the PCB stackup design process that would enable a product to work right the first time and every time thereafter. These enhancements incorporate a real integrated approach in which product developers can carry out electromagnetic and thermomechanical simulations, manufacturability checks, and reliability predictions before committing to manufacturing operations.
What are the key issues that people need to worry about with high-frequency/high-data-rate designs?
Of course, the primary concern for these applications is signal integrity with a heavy emphasis on impedance and losses. Typically, field solvers do a good job of modeling impedances. But at higher data rates and frequencies, the proximity effect becomes quite significant and ground-plane losses can account for 20% or more of the total loss.
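To get a feel for why reference-plane and proximity losses grow with data rate, it helps to look at the skin depth of copper. The sketch below is generic transmission-line physics, not Avishtech’s loss model; it simply shows how thin the conducting layer becomes near 28 GHz, roughly the Nyquist frequency of a 112-Gb/s PAM4 channel (56 Gbaud).

```python
import math

RHO_CU = 1.68e-8           # bulk copper resistivity, ohm-m (assumed value)
MU_0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth_m(freq_hz: float, rho: float = RHO_CU) -> float:
    """Skin depth: delta = sqrt(rho / (pi * f * mu0))."""
    return math.sqrt(rho / (math.pi * freq_hz * MU_0))

# A 112-Gb/s PAM4 channel runs at 56 Gbaud, so its Nyquist frequency is ~28 GHz
for f_ghz in (1, 10, 28):
    print(f"{f_ghz:>2} GHz: skin depth ~ {skin_depth_m(f_ghz * 1e9) * 1e6:.2f} um")
```

At 28 GHz the current is confined to roughly 0.4 µm of copper, so conductor roughness and the distribution of return current in the reference planes stop being rounding errors and become meaningful contributors to total loss, consistent with the 20% figure cited above.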
Beyond this, there are several things that product developers haven’t had to consider before as critically in terms of manufacturability and reliability. The specific constructions used in a particular stackup design can lead to issues such as glass stop, which is responsible for problems like conductive anodic filaments (CAFs), voiding, and drill-induced crazing.
When you move into high-frequency, high-data-rate designs, those are the hidden “gotchas” that will determine if your product works as designed or even if the product can be successfully manufactured. Relying on just one or two material attributes will not suffice for these types of end-application products.
How were these issues addressed before?
Previously, the process consisted of “best guess” estimations or, more often, general rules of thumb that could only be verified by building a test vehicle board. If problems were detected at that point, the only solution was to go back to square one and start the design process all over again, often with a new material, which may or may not have been the culprit in the first place. This led to time-intensive and expensive rework that could then impact critical time-to-market windows, competitive advantages, and the overall profitability of a product line.
The old paradigm was the only one available at the time, so the seemingly endless series of product re-spins was expected and accepted. But with the current set of challenges for high-frequency/high-data-rate designs, tolerances have tightened to the point that product developers are hitting these walls more often, which calls for an integrated approach.
What are the types of data that your technology can provide?
For instance, by accounting for the ground-plane losses I mentioned earlier, something that’s not available in competitive offerings, we can accurately model insertion losses. We can also accurately predict resin starvation and glass stop, with modeling that includes the effects of dielectric fillers and conductor roughness.
At the highest level, we accurately simulate the PCB thermomechanical properties that are critical for reliability predictions. Then, based on these thermomechanical properties, we can predict plated through-hole, microvia, and solder-joint reliability. In addition, we simulate impedance along with the frequency- and roughness-dependent losses associated with the dielectric materials. Not only that, but we can also flip the problem and solve for the line widths required to achieve a target impedance for the entire stackup.
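As a rough illustration of that “flipped” calculation, here is a minimal sketch that inverts the textbook Hammerstad microstrip approximation with a bisection search to find the width that hits a target impedance. The dielectric height, Dk, and 50-Ω target below are hypothetical, and Gauss 2D’s actual field solver is far more rigorous than this closed-form estimate.

```python
import math

def microstrip_z0(w: float, h: float, er: float) -> float:
    """Hammerstad closed-form estimate of surface-microstrip impedance (ohms).
    w = trace width, h = dielectric height (same units), er = dielectric constant."""
    u = w / h
    if u <= 1:
        eeff = (er + 1) / 2 + (er - 1) / 2 * ((1 + 12 / u) ** -0.5 + 0.04 * (1 - u) ** 2)
        return 60 / math.sqrt(eeff) * math.log(8 / u + u / 4)
    eeff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 / u) ** -0.5
    return 120 * math.pi / (math.sqrt(eeff) * (u + 1.393 + 0.667 * math.log(u + 1.444)))

def width_for_z0(z_target: float, h: float, er: float) -> float:
    """Bisection search: impedance falls monotonically as the trace widens."""
    lo, hi = 0.05 * h, 20.0 * h   # bracket the width search
    for _ in range(60):
        mid = (lo + hi) / 2
        if microstrip_z0(mid, h, er) > z_target:
            lo = mid              # still too narrow (impedance too high)
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical example: 4-mil dielectric, Dk = 3.6, 50-ohm single-ended target
print(f"width ~ {width_for_z0(50.0, h=4.0, er=3.6):.2f} mil")
```

A real stackup solve also has to account for copper thickness, solder mask, dielectric anisotropy, and roughness, which is precisely where a full 2D field solver earns its keep over a closed-form formula like this one.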
A number of these operations are accomplished rapidly with a single click and virtually no learning curve, all in a fully integrated environment.
Many EDA tools offer large databases of materials information. How is your technology any different?
It’s not just about having massive amounts of data. It’s the ability to quantify and qualify that data in such a way that it’s relevant to your design before you commit it to hardware. We haven’t just built another iteration of the same mousetrap—a broad-based laminate database.
We have incorporated a level of intelligence into our technology in the form of a proprietary algorithm that’s able to extract detailed mechanical properties at the polymer level that are critical to the thermomechanical simulations performed by our stackup tool. These properties are stored in our library for use as inputs into our thermomechanical simulations, which provide board-level properties that can’t be simulated through other means.
As I mentioned earlier, these simulations allow us to determine how such properties will impact the design, manufacturability, long-term operability, and reliability of the end product. All of the necessary functionality is built directly into the toolset. And we have done extensive validation of our predictions against several built test vehicle boards. All of our results have been within experimental error of the test measurements.
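For readers who want a feel for what rolling constituent-level data up into board-level properties means, the toy sketch below applies classical rule-of-mixtures/Turner micromechanics to assumed glass and resin values. It is only a first-order illustration of the concept, not Avishtech’s proprietary algorithm, and woven-glass laminates require far more detailed treatment than these simple mixing rules.

```python
def inplane_modulus_gpa(vg: float, e_glass: float, e_resin: float) -> float:
    """Voigt rule of mixtures for in-plane modulus (GPa)."""
    return vg * e_glass + (1 - vg) * e_resin

def inplane_cte_ppm(vg: float, e_glass: float, e_resin: float,
                    a_glass: float, a_resin: float) -> float:
    """Turner estimate of in-plane CTE (ppm/degC): a stiffness-weighted average."""
    vr = 1 - vg
    return (a_glass * e_glass * vg + a_resin * e_resin * vr) / (e_glass * vg + e_resin * vr)

# Assumed constituents: E-glass (~72 GPa, ~5.4 ppm/degC), cured epoxy (~3.5 GPa, ~60 ppm/degC)
vg = 0.5  # assumed glass volume fraction
print(f"in-plane modulus ~ {inplane_modulus_gpa(vg, 72, 3.5):.1f} GPa")
print(f"in-plane CTE     ~ {inplane_cte_ppm(vg, 72, 3.5, 5.4, 60):.1f} ppm/degC")
```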
How have the materials suppliers responded to the information you incorporated into your toolset?
They have responded quite well. Our materials library consists of permittivity (Dk) and dissipation-factor (Df) data and construction lists that come from the laminate materials vendors. They’ve been happy to provide this data and are pleased that we can provide the detailed thermomechanical information on boards built using their materials.
Beyond the product solutions we deliver to our customer base, we’re focused on educating the market. We’re educating our end users, the materials suppliers, and the fabricators. The concept of providing a bridge between design and manufacturing is a new one and there are several audiences for the information that we provide.
The characterization capabilities you provide sound quite complex. How easy is it to use your stackup tool?
Actually, it’s quite easy to use and that’s been something that we have focused on since day one; it also ties into our focus on education. Any EDA toolset is only as good as the designer using it. And, quite honestly, that’s been the shortcoming with a lot of the competitive products available in the marketplace.
Only the most skilled engineers could use those toolsets, and even in their hands the process fell short, because those offerings still didn’t provide much of the most important information pertinent to manufacturability and durability in the field. That meant several simulation iterations followed by several real-world validation iterations just to learn whether the material and design they had chosen would work in their product.
To make matters worse, some minor modifications to the constructions used in the stackup would inevitably occur between the prototyping stage and the manufacturing stage, leading to problems that seemed to come out of nowhere. This is where our broader vision comes into play. It doesn’t do any good to specify a material for a certain design only to find out that the design won’t be manufacturable, reliable, or meet long-term operation requirements. We built our Gauss products specifically to bridge this gap.
What’s the response been thus far to your approach to the design process and the technology that you have created?
It’s been very positive. Product developers are very excited to see a tool that provides a new breadth and depth of insight into the PCB product development process. Fabricators are enthusiastic that our tool provides a new level of validation that the designs they receive from their customers will be manufacturable in volume.
Likewise, business managers welcome a solution that enables them to create a product-development timeline that’s achievable and cost-effective and will allow them to deliver a very competitive, profitable product to the marketplace. We’ve been seeing strong and enthusiastic adoption of our offerings across the market.
In addition, we have received critical validation from one of the industry’s foremost experts in PCB design and simulation, Lee Ritchey. As is well known, Lee is skeptical of companies that introduce new EDA toolsets, and he has never publicly supported a design tool before. We demoed our tool for him, and he then provided us with the stackup of a product he had previously designed. The results we gave him matched the ones he had obtained himself, but he had needed to build a test board to get them.