INTEL INNOVATION 2022 – San Jose, California – Intel announced new hardware and software capabilities for Project Amber, a confidential computing service that combines hardware and software to verify and attest that data is trustworthy, at its developer conference this week.
The improvements include features to protect data from the moment it leaves a system, whether it is in transit, in use, or at rest in storage.
“This is a core foundational technology that Intel has been developing for years. Where it will matter most is in AI and machine-learning models … to make sure that when you run a model at the edge, it hasn’t been tampered with,” said Greg Lavender, Intel’s chief technology officer, during his keynote on Wednesday.
Once data leaves the data center, it makes multiple stops before reaching cloud services or completing a round trip back to enterprise infrastructure. Along the way, information from sources such as sensors is added as the data crosses communications networks, with waypoints and artificial-intelligence chips ensuring that only relevant data moves forward.
Project Amber uses hardware and software techniques to verify that data packets, and the hardware they originate from, are trustworthy. This layer of trust between devices and waypoints while data is in transit provides assurance that a company’s infrastructure and execution environments are secure, says Anil Rao, Intel’s vice president of systems architecture and engineering in the office of the CTO.
“Gone are the days when hubs were just data movers,” says Rao. “They are not mere data movers. They are intelligent data movers.”
Confidential computing is critical for organizations that mix their own data sets with information from third parties to enhance AI models. Project Amber provides a way to ensure that the data comes from reliable sources, says Rao.
Project Amber adds a stronger locking mechanism to protect data while it is being processed. Trust Domain Extensions (TDX), an instruction set on the company’s upcoming 4th Gen Xeon Scalable processors, can secure an entire virtual machine as a trusted perimeter.
The data is secured so that the hypervisor – which manages and monitors the virtual machines – cannot peek into the confidential computing environment.
“Your application will still make virtual machine entry and exit calls, but during those calls the data remains encrypted,” says Rao.
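A toy illustration of the property Rao describes: the hypervisor may handle a guest's memory, but it only ever observes ciphertext, while decryption happens inside the trust domain. Here a one-time-pad XOR stands in for TDX's hardware memory encryption — purely illustrative, not how the silicon actually works:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for hardware memory encryption (one-time pad).
    return bytes(a ^ b for a, b in zip(data, key))

guest_secret = b"customer record #42"
key = secrets.token_bytes(len(guest_secret))  # held by the CPU, never the hypervisor

# What the hypervisor can observe during a VM exit: ciphertext only.
ciphertext = xor(guest_secret, key)

# Back inside the trust domain, decryption is transparent to the guest.
recovered = xor(ciphertext, key)
assert recovered == guest_secret
```

The point of the sketch is the trust split: whoever holds `key` (the CPU) can recover the plaintext; whoever merely moves `ciphertext` around (the hypervisor) learns nothing useful.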
Today’s cloud computing environments are built on virtual machines, and applications do not run directly on the processors, says Steve Lipson, principal analyst at Tirias Research.
“When we ran directly on processors, we didn’t need attestation, because nobody was going to change a Xeon. But a virtual machine is just software, and you can change it. Attestation tries to give software the same kind of solidity that silicon gives hardware processors,” says Lipson.
TDX is broader in scope than Software Guard Extensions (SGX), a secure area of memory into which code is loaded and executed. SGX, a common feature on Intel chips, is part of Project Amber.
Intel’s Rao compares the scope of TDX and SGX to a hotel room. If TDX is a trust boundary the size of the hotel room itself, SGX is the safe inside that room.
Project Amber allows data to enter secure enclaves only after it matches digital codes issued by the Amber engine. If the codes match, the data can enter the secure area; if not, entry is denied because the data could have been altered, modified, or compromised in transit.
“It’s like you give someone your VIN and say, ‘Is this the original VIN of my car or did someone do something terrible with this thing?’” Rao says.
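The code-matching step Rao describes — and the VIN analogy — boils down to comparing a freshly computed measurement against a known-good reference value. A minimal sketch of that idea, using hypothetical function names rather than the actual Amber API:

```python
import hashlib
import hmac

def measure(payload: bytes) -> str:
    # Compute a digest of the incoming payload (stand-in for TEE evidence).
    return hashlib.sha256(payload).hexdigest()

def admit(payload: bytes, reference_digest: str) -> bool:
    # Admit data only if its digest matches the known-good reference.
    # compare_digest avoids timing side channels during the comparison.
    return hmac.compare_digest(measure(payload), reference_digest)

trusted = b"model-weights-v1"
reference = measure(trusted)          # the "original VIN" recorded up front

print(admit(trusted, reference))      # unmodified payload is admitted
print(admit(trusted + b"X", reference))  # tampered payload is rejected
```

Any change to the payload, however small, produces a different digest, so the tampered copy fails the check just as a re-stamped VIN would.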
Intel will also give customers the ability to define their own policies for creating a trusted execution environment.
“You might want to process everything in an East Coast rather than a West Coast data center,” Rao says. “And Amber will tell you exactly that – your code has not passed the policy.”
Protection in the clouds
Amber will support several cloud service providers, but Intel has not provided specific details.
“We want to make it multicloud so that you, as an organization, don’t need a different attestation mechanism when you go to different clouds,” Rao says.
There are hundreds of millions of Intel processors in data centers around the world, says Lipson, and bad actors are constantly trying to break into those servers and steal secrets.
“It’s a cat-and-mouse game, and Intel is constantly trying to develop new ways to keep the bad guys from breaking into servers and stealing secrets. That stretches all the way from script kiddies to state-sponsored hackers,” says Lipson.
At some point, one has to think about protecting data in use, in motion, and at rest. Something like Project Amber was inevitable, especially as computing moves away from on-premises infrastructure toward the cloud, says Lipson.
Project Amber is still in the pilot phase as Intel prepares the technology for specific vertical markets. The chipmaker is working with research firm Leidos to apply Project Amber in the healthcare sector, which has many types of devices and sensors spread over large geographic areas and requires attestation to ensure that systems receive only reliable data.