China makes AI breakthrough, reportedly trains generative AI model across multiple data centers and GPU architectures

Beijingwalker


China makes AI breakthrough, reportedly trains generative AI model across multiple data centers and GPU architectures

News
By Jowi Morales
September 29, 2024

(Image: AI data center. Credit: Shutterstock)

An industry analyst recently revealed that China has developed a single generative AI (GAI) model across multiple data centers — a massive feat considering the complexity of using different GPUs in a single data center, let alone using servers in multiple geographic locations. Patrick Moorhead, Chief Analyst at Moor Insights & Strategy, said on X (formerly Twitter) that China was the first country to manage this achievement and that he discovered it during a conversation about a presumably unrelated NDA meeting.

This technique of training generative AI models across different locations and architectures is essential for China to keep its AI ambitions moving forward, especially as American sanctions have stopped it from acquiring the latest, most powerful chips for its research and development. Since Nvidia does not want to lose the Chinese market, it created the less powerful H20 AI chips, which fall within Washington's restrictive performance parameters. However, there are rumors that even these down-tuned chips might be banned soon, highlighting the uncertainty Chinese tech companies face in the current political climate.

Because of this uncertainty, Chinese researchers have been working on melding GPUs from different brands into one training cluster. By doing so, institutions could combine their limited stocks of sanctioned high-end chips, like the Nvidia A100, with less powerful but readily available GPUs, like Huawei's Ascend 910B or the aforementioned Nvidia H20. This technique could help them combat the high-end GPU shortage within China, although it has historically come with large drops in efficiency.

However, it seems that China has found ways to solve this issue, especially with the news of a single generative AI model being trained across multiple data centers. Although we don't have any information on this model yet, it shows the lengths Chinese researchers will go to in order to keep driving China's AI ambitions forward. As Huawei said, China will find ways to continue advancing its AI development despite American sanctions. After all, necessity is the mother of invention.
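One piece of the mixed-GPU clustering the article describes is load balancing: faster and slower accelerators have to be fed work in proportion to their speed, or the slow devices stall the fast ones at every synchronization point. Here is a minimal, purely illustrative sketch of proportional batch splitting; it is not from the article, and the device names and throughput numbers are hypothetical.

```python
# Hedged sketch: split a global mini-batch across heterogeneous GPUs in
# proportion to each device's measured throughput, so no device becomes
# the bottleneck at the synchronization barrier. Numbers are made up.

def split_batch(total_batch, throughputs):
    """Assign samples proportionally; leftovers go to the fastest devices."""
    total = sum(throughputs.values())
    shares = {name: int(total_batch * t / total)
              for name, t in throughputs.items()}
    # Integer truncation can leave a few samples unassigned; hand them
    # out one at a time, fastest device first.
    leftover = total_batch - sum(shares.values())
    for name in sorted(throughputs, key=throughputs.get, reverse=True):
        if leftover == 0:
            break
        shares[name] += 1
        leftover -= 1
    return shares

# Hypothetical relative throughputs (samples/sec), for illustration only.
cluster = {"A100": 100.0, "H20": 40.0, "Ascend-910B": 60.0}
print(split_batch(512, cluster))
# → {'A100': 257, 'H20': 102, 'Ascend-910B': 153}
```

Real systems would measure throughput at runtime and rebalance as conditions change, but the proportional-share idea is the core of it.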
 

Bullshit article.

It is expected that someone like Patrick Moorhead would fall for this kind of stupidity. The guy has no technical background and is essentially a sales guy by training and experience.

@Nilgiri

We use Horovod for distributed machine learning a lot at my workplace. It's an MPI-based architecture for training deep-learning models.

We use Horovod for distributed parallel processing and TensorFlow as the deep-learning framework. TensorFlow has support for both CUDA (Nvidia's GPU API) and OpenCL via the Coriander cross-compiler. Admittedly, OpenCL support is sketchy, but it is very usable.

We have had a heterogeneous cluster for some time now: older and newer Nvidia devices, and even a few AMD devices.

It's funny that China just now learnt about Horovod? It's been out since last year now. 🤭
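To make the above concrete, here is a stdlib-only toy sketch of the gradient-averaging ("allreduce") step that Horovod performs over MPI. It is not Horovod's actual API; the model, data, and function names are my own, and a real setup would use `hvd.DistributedOptimizer` over many processes.

```python
# Toy sketch of data-parallel training with an allreduce, the core of
# what Horovod does over MPI: each "worker" computes gradients on its
# own data shard, all workers average gradients, then every replica
# applies the same update, so weights stay identical across devices.

def local_gradient(weights, shard):
    # Toy gradient of mean squared error for y = w * x on one shard.
    w = weights[0]
    g = sum(2 * (w * x - y) * x for x, y in shard) / len(shard)
    return [g]

def allreduce_mean(grads_per_worker):
    # The allreduce step: element-wise mean across all workers.
    n = len(grads_per_worker)
    return [sum(g[i] for g in grads_per_worker) / n
            for i in range(len(grads_per_worker[0]))]

def train_step(weights, shards, lr=0.01):
    grads = [local_gradient(weights, s) for s in shards]  # parallel part
    avg = allreduce_mean(grads)                           # the allreduce
    return [w - lr * g for w, g in zip(weights, avg)]

# Two "workers", each holding its own shard of y = 3x data.
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
weights = [0.0]
for _ in range(200):
    weights = train_step(weights, shards)
print(round(weights[0], 2))  # → 3.0
```

The point is that the averaging step is indifferent to what hardware computed each gradient, which is why mixed-GPU clusters are workable at all.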
 
Talk to him

Jowi Morales

Contributing Writer
Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, with a focus on tech hardware and consumer electronics.
 

Look at his education and experience... He has no education in technology or engineering. He is just getting excited over a single tweet by that Moor guy, who himself is from sales.

He is a fluff writer. Not even close to having any idea of what he is talking about.

It's like a Hollywood writer on hacking. It looks flashy but is complete bullshit.
 

You can post yours here too, and you can also write your own article and send it to news channels to see if they'd like to publish it.
 
Thing is, one posts articles about things that are interesting.

Things that are routine are not newsworthy. What he is talking about is not really that big of a deal and is done routinely in industry. You do not make news out of it.
 
Talk to the author or the news channel, not me.
 
If I'm interested, I will do my research online, not get it from a nobody like you.
HAHAHAHA.

Look, before talking to me, you probably did not even know what Horovod was, what MPI is, or what distributed DNN training is. All you did was see "China," "breakthrough," "AI," and "GPU," and you posted it here.
Heck, you could not even be bothered to look into the background of the people responsible for this "news."
Sure!
 
Lol, so you still insist I'm interested in this specific field? I told you to talk to the author and the news channel, not me.
 
Look, you posted it, so I believed you were interested. If you are not, well, most of the world is free for thoughts, so yeah, just do not read the stuff.
 
You are contradicting yourself: "If I'm interested, I will do research online, not from a nobody like you." "You can also write your own article to send to news channels to see if they like to publish them."
 
