China Telecom's AI Research Institute claims it trained a 100-billion-parameter model using only domestically produced computing power – a feat that suggests Middle Kingdom entities are not hugely perturbed by sanctions that stifle exports of Western tech to the country.
The model is known as TeleChat2-115B and, according to a GitHub update posted on September 20, was "trained entirely with domestic computing power and open sourced."
“The open source TeleChat2-115B model is trained using 10 trillion tokens of high-quality Chinese and English corpus,” the project’s GitHub page states.
The page also contains a hint about how China Telecom may have trained the model, in a mention of compatibility with the "Ascend Atlas 800T A2 training server" – a Huawei product listed as supporting the Kunpeng 920 7265 or Kunpeng 920 5250 processors, respectively running 64 cores at 3.0GHz and 48 cores at 2.6GHz.
Huawei builds these processors using the Arm 8.2 architecture and bills them as produced with a 7nm process.
- Via the cloud, it's China against the world
- Alibaba Yitian 710 rated fastest Arm server CPU in the cloud (for now)
- Chinese CPUs to feature in servers made by sanctioned Russian firm
- Huawei hands its cloud Linux to China's only open source foundation
At 100 billion parameters, TeleChat2 trails the likes of recent Llama models that apparently top 400 billion parameters, or OpenAI's o1, which has been guesstimated to have been trained with 200 billion parameters. While parameter count alone doesn't determine a model's power or utility, the low-ish parameter count suggests training TeleChat2 would likely have required less computing power than was needed for other projects.
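To put that in rough perspective, a widely used rule of thumb estimates dense-transformer training compute at around six FLOPs per parameter per token. The sketch below assumes the 10 trillion token figure from the project's GitHub page and, purely for comparison, applies the same token budget to a hypothetical 405-billion-parameter model – the token counts for other models are illustrative assumptions, not reported figures.

```python
# Back-of-the-envelope training compute comparison using the common
# ~6 * parameters * tokens FLOPs approximation for dense transformers.

def training_flops(parameters: float, tokens: float) -> float:
    """Approximate total training FLOPs for a dense transformer."""
    return 6 * parameters * tokens

# TeleChat2-115B: 115B parameters, 10T tokens (per the GitHub page)
telechat2 = training_flops(115e9, 10e12)

# Hypothetical 405B-parameter model on the same token budget (assumption for comparison)
big_model = training_flops(405e9, 10e12)

print(f"TeleChat2-115B: ~{telechat2:.1e} FLOPs")
print(f"405B model, same tokens: ~{big_model:.1e} FLOPs "
      f"({big_model / telechat2:.1f}x more)")
```

On those assumptions, the larger model would need roughly 3.5 times the compute – which illustrates why a 100-billion-parameter run is a more plausible fit for whatever domestic hardware China Telecom had on hand.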
Which could be why we can't find a mention of a GPU – though the Ascend training server has a very modest one to drive a display at 1920 × 1080 at 60Hz with 16 million colors.
It therefore appears the infrastructure used to train this model was not at parity with the kind of rigs available outside China, suggesting that tech export sanctions are not preventing the Middle Kingdom from pursuing its AI ambitions.
Or that it can deliver in other ways, such as through China Telecom's vast scale. The carrier has revenue of over $70 billion, drawn from its provision of over half a billion wired and wireless subscriptions. It is also one of the largest users and promoters of OpenStack. Even without access to the latest and greatest AI hardware, China Telecom can muster plenty of power. ®