Jonathan Ross, chief executive officer of Groq Inc., during the GenAI Summit in San Francisco, California, US, on Thursday, May 30, 2024.
David Paul | Bloomberg | Getty Images
Artificial intelligence semiconductor startup Groq on Monday announced it has established its first data center in Europe as it steps up its international expansion.
Groq, which is backed by the investment arms of Samsung and Cisco, said the data center will be located in Helsinki, Finland, in partnership with Equinix.
Groq is looking to take advantage of rising demand for AI services in Europe, following other U.S. companies that have also ramped up investment in the region. The Nordics in particular is a popular location for data centers, as the region has easy access to renewable energy and cooler climates. Last month, Nvidia CEO Jensen Huang was in Europe and signed several infrastructure deals, including data centers.
Groq, which is valued at $2.8 billion, designs a chip the company calls a language processing unit (LPU). It is designed for inference rather than training. Inference is when a pre-trained AI model interprets live data to produce a result, much like the answers generated by popular chatbots.
While Nvidia has a stranglehold on the chips required for training huge AI models with its graphics processing units (GPUs), there is a swathe of startups hoping to take a slice of the pie when it comes to inference. SambaNova; Ampere, a company SoftBank is in the process of acquiring; Cerebras; and Fractile are all looking to join the AI inference race.
There are a number of areas where Groq wants to stand out from its rivals, including Nvidia, according to CEO Jonathan Ross.
In a Monday interview with CNBC, Ross said that Nvidia chips use expensive components such as high-bandwidth memory, which currently has very few suppliers. Groq's LPUs, meanwhile, do not use such components, and the company's supply chain is largely based in North America.
"We're not as supply limited, and that's important for inference, which is very high volume, low margin," Ross told CNBC's "Squawk Box Europe."
"And the reason that we're so good for Nvidia's shareholders is, we're happy to take that high-volume but lower-margin business and let others focus on the high-margin training."
Ross also touted Groq's ability to deploy its technology at speed. He said the company decided just four weeks ago to build the data center in Helsinki and is unloading its server racks into the location now.
"We expect to be serving traffic starting by the end of this week. That's built fast, and it's a very different proposition from what you see in the rest of the market," Ross said.
European politicians have been pushing the notion of sovereign AI, the idea that data centers must be located in the region. Data centers located closer to users also help improve the speed of services.
Global data center builder Equinix connects different cloud providers together, such as Amazon Web Services and Google Cloud, making it easier for businesses to use multiple vendors. Groq's LPUs will be installed inside the Equinix data center, allowing businesses to access Groq's inference capabilities through Equinix.
Groq currently has data centers in the U.S., Canada and Saudi Arabia.