A team of researchers from Google has unveiled a new AI model that can come up with complex chip designs in hours – a burdensome, intricate task that typically takes months for human engineers to complete.
The researchers fed a dataset of 10,000 chip layouts to a machine-learning model, which was then trained with reinforcement learning. In only six hours, the model could generate a design that optimizes the placement of different components on the chip, creating a final layout that satisfies operational requirements such as processing speed and power efficiency.
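Google’s agent learns a placement policy with reinforcement learning; as a toy illustration of the kind of objective such an agent optimizes, the sketch below scores placements of a few blocks on a small grid by half-perimeter wirelength (HPWL), a standard proxy metric in placement work, and finds the cheapest layout by brute force. The block names, nets, and grid size are made-up examples, not Google’s actual setup:

```python
import itertools

# Toy netlist: each net is a group of blocks that must be wired together.
# Blocks, nets, and the 3x3 grid are illustrative assumptions.
BLOCKS = ["A", "B", "C", "D"]
NETS = [("A", "B"), ("B", "C"), ("A", "C", "D")]
GRID = [(x, y) for x in range(3) for y in range(3)]

def hpwl(placement):
    """Half-perimeter wirelength: per net, bounding-box width + height."""
    total = 0
    for net in NETS:
        xs = [placement[b][0] for b in net]
        ys = [placement[b][1] for b in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def best_placement():
    """Enumerate every non-overlapping placement and keep the cheapest."""
    best, best_cost = None, float("inf")
    for cells in itertools.permutations(GRID, len(BLOCKS)):
        placement = dict(zip(BLOCKS, cells))
        cost = hpwl(placement)
        if cost < best_cost:
            best, best_cost = placement, cost
    return best, best_cost

placement, cost = best_placement()
print(cost)  # → 4, the minimum total wirelength on this toy instance
```

At realistic scale this brute-force enumeration is hopeless, which is exactly why a learned policy that places blocks one at a time and is rewarded for low wirelength is attractive.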
The method’s success is such that Google has already used the model to design its next generation of tensor processing units (TPUs), which run in the company’s data centers to improve the performance of various AI applications.
“Our RL (reinforcement learning) agent generates chip layouts in just a few hours, whereas human experts can take months,” tweeted Anna Goldie, research scientist at Google Brain, who took part in the research. “These superhuman AI-generated layouts were used in Google’s latest AI accelerator (TPU-v5)!”
Modern chips contain billions of different components laid out and connected on a piece of silicon the size of a fingernail. For example, a single processor will typically contain tens of millions of logic gates, also called standard cells, and thousands of memory blocks, known as macro blocks – which then have to be wired together.
The placement of standard cells and macro blocks on the chip is crucial to determine how quickly signals can be transmitted on the chip, and therefore how efficient the end device will be.
This is why much of engineers’ work focuses on optimizing the chip’s layout. It starts with placing the larger macro blocks, a process called “floorplanning” that consists of finding the best configuration for these components while bearing in mind that standard cells and wiring will have to fit in the remaining space.
The number of possible layouts for macro blocks is colossal: according to Google’s researchers, there are on the order of ten to the power of 2,500 different configurations to put to the test – that is, a 1 followed by 2,500 zeros.
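The scale of that number can be reproduced with simple counting: placing m distinct macro blocks onto n candidate grid cells admits n × (n−1) × … × (n−m+1) ordered placements. A quick sketch, using purely illustrative sizes rather than the dimensions of any real chip:

```python
import math

def placement_digits(n_cells, n_macros):
    """Number of decimal digits in n * (n-1) * ... * (n-m+1),
    the count of ordered placements of m macros on n cells."""
    log10_count = sum(math.log10(n_cells - i) for i in range(n_macros))
    return int(log10_count) + 1

# Two macros on ten cells: 10 * 9 = 90 placements, a 2-digit count.
print(placement_digits(10, 2))  # → 2

# Illustrative scale: 1,000 macros on a 10,000-cell grid already
# yields a count thousands of digits long.
print(placement_digits(10_000, 1_000))
```

Working in logarithms keeps the arithmetic tractable: the count itself would overflow any fixed-width number type long before reaching realistic problem sizes.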
What’s more, once an engineer has come up with a layout, they will likely have to tweak and adjust the design as standard cells and wiring are added. Each iteration can take up to several weeks.
Given the painstaking complexity of floorplanning, the whole process seems an obvious match for automation. Yet for several decades, researchers have failed to come up with a technology that can remove the burden of floorplanning for engineers.
Chip designers can rely on computer software to assist them in the task, but it still takes many months to work out how to best assemble components on the device.
And the challenge is only getting harder. The often-cited Moore’s law predicts that the number of transistors on a chip doubles roughly every two years – meaning that engineers face a problem that grows exponentially with time, while still having to meet tight schedules.
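That growth rate compounds quickly: a doubling every two years multiplies transistor counts 32-fold over a decade. A one-line projection, with purely illustrative starting figures:

```python
def transistor_count(start_count, years, doubling_period_years=2.0):
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period_years)

# A hypothetical 1-billion-transistor chip, projected ten years out:
# 10 / 2 = 5 doublings, i.e. a 32x increase.
print(transistor_count(1e9, 10))  # → 32000000000.0
```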
This is why Google’s apparently successful attempt to automate floorplanning could be game-changing. “Very nice work from Google on deep RL-based optimization for chip layout,” tweeted Yann LeCun, chief AI scientist at Facebook, congratulating the team on overcoming “40 years” of attempts at resolving the challenge.
Google’s new AI model could hardly land at a better time: the semiconductor industry is currently rocked by a global shortage of chips that is hitting a number of sectors, ranging from consumer electronics to automotive.
While the shortage has been caused by insufficient capacity at the fabrication level, rather than by the design of semiconductors, the fact remains that cutting the time it takes to develop next-generation chips could bring welcome relief to the entire supply chain.
The scientific journal Nature, for one, welcomed the new method. “Researchers at Google have managed to greatly reduce the time needed to design microchips,” it said. “This is an important achievement and will be a huge help in speeding up the supply chain.”
Although the machine-learning model could impact the industry as a whole, it will be worth keeping an eye on Google’s own use of the technology, too.
The search giant has long been explicit that its ambition is to create custom processors in-house, particularly in the form of systems-on-chips (SoCs).
Now Google is using AI to design chips, far faster than human engineers can do the job, posted on www.zdnet.com on June 11, 2021.