Cloudflare Brings AI to its Global Edge Network with NVIDIA

By the Editor of Hosting Journalist, April 14, 2021

Content delivery network (CDN) provider Cloudflare has partnered with NVIDIA to bring AI at scale to its global edge network. The combination of NVIDIA accelerated computing technology and Cloudflare’s edge network would create a massive platform on which developers can deploy applications that use pre-trained or custom machine learning models in seconds.

Today’s applications use AI for a variety of tasks, from translating text on webpages to recognizing objects in images, making machine learning models a critical part of application development. Users expect this functionality to be fast and reliable, while developers want to keep proprietary machine learning models secure.

NVIDIA provides developers with a broad range of AI-powered application frameworks including Jarvis for natural language processing, Clara for healthcare and life sciences, and Morpheus for cybersecurity.

By leveraging the TensorFlow platform, developers can use familiar tools to build and test machine learning models, and then deploy them globally onto Cloudflare’s edge network in seconds.
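The announcement does not detail the deployment interface itself, but the build-and-test half of that workflow relies on standard TensorFlow tooling. As a minimal sketch, assuming TensorFlow 2.x, with an illustrative model, data, and export path (none of which come from Cloudflare or NVIDIA):

```python
# Minimal sketch of the "build and test" step described above, using
# standard TensorFlow 2.x Keras APIs. The model, data, and export path
# are illustrative; the announcement does not specify how a saved model
# is handed off to Cloudflare's edge network.
import numpy as np
import tensorflow as tf

# A tiny classifier standing in for a real pre-trained or custom model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Build and test locally with familiar tools...
x_train = np.random.rand(256, 128).astype("float32")
y_train = np.random.randint(0, 2, size=(256,))
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)

# ...then export in the SavedModel format, the usual hand-off point
# before a model is pushed to a serving environment.
model.save("edge_model/1")
```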

“Cloudflare Workers is one of the fastest and most widely adopted edge computing products with security built into its DNA,” said Matthew Prince, co-founder and CEO of Cloudflare. “Now, working with NVIDIA, we will be bringing developers powerful artificial intelligence tools to build the applications that will power the future.”

Beyond Centralized Servers and Cloud Regions
Machine learning models are often deployed on expensive centralized servers or via cloud services that limit them to a handful of cloud regions around the world. Together, Cloudflare and NVIDIA intend to put machine learning within milliseconds of the global online population, enabling high-performance, low-latency AI to be deployed by anyone. And because the machine learning models themselves will remain in Cloudflare’s data centers, developers can deploy custom models without the risk of putting them on end-user devices, where they might be stolen.

“As companies are increasingly data-driven, the demand for AI technology grows,” said Kevin Deierling, senior vice president of networking at NVIDIA. “NVIDIA offers developers AI frameworks to support applications ranging from robotics and healthcare to smart cities and now cybersecurity with the recently launched Morpheus.”

Internally, Cloudflare uses machine learning for a variety of needs, including business intelligence, bot detection, and anomaly identification. Cloudflare uses NVIDIA accelerated computing to speed up training and inference tasks and will bring the same technology to any developer who uses Cloudflare Workers.
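For a rough sense of what NVIDIA-accelerated computing looks like from a developer’s perspective, the sketch below (illustrative only, not Cloudflare’s internal code) shows TensorFlow detecting an NVIDIA GPU and placing work on it automatically:

```python
# Illustrative only: how GPU acceleration typically appears in a
# TensorFlow workflow. Nothing here reflects Cloudflare's internal systems.
import tensorflow as tf

# TensorFlow reports any NVIDIA GPUs it can see via the CUDA runtime.
gpus = tf.config.list_physical_devices("GPU")
print(f"NVIDIA GPUs visible to TensorFlow: {len(gpus)}")

# A large matrix multiply is placed on the GPU when one is available
# and transparently falls back to the CPU otherwise.
a = tf.random.normal((4096, 4096))
b = tf.random.normal((4096, 4096))
c = tf.matmul(a, b)
print(c.device)  # e.g. .../device:GPU:0 on a GPU-equipped machine
```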

Headquartered in San Francisco, California, CDN provider Cloudflare also has offices in Austin, TX; Champaign, IL; New York, NY; San Jose, CA; Seattle, WA; Washington, D.C.; Toronto; Lisbon; London; Munich; Paris; Beijing; Singapore; Sydney; and Tokyo. NVIDIA is headquartered in Santa Clara, California.