(Source: Shutterstock)
Several developments around Beijing-based startup DeepSeek, and their ripple effects across the global AI landscape, dominated China's AI front this week. In the headlines, the firm's next big model has been pushed from its release window by hardware-driven delays tied to domestic chips. In the United States, OpenAI CEO Sam Altman credited Chinese open-source pressure with prompting his company to adjust its model transparency strategy.
R2 launch postponed amid Huawei chip push
DeepSeek was expected to follow up on January's R1 this summer with a model known internally as R2. Instead, the rollout stalled after the firm failed to complete training runs on Huawei's Ascend 910C processors, according to the Financial Times. The report states that Chinese authorities "encouraged" DeepSeek to adopt Huawei silicon instead of Nvidia hardware, as part of Beijing's broader goal of replacing US tech across the AI stack.
According to the report, Huawei's chips suffer stability issues compared with Nvidia's products, including slower inter-GPU bandwidth and less mature software. Although Huawei sent a team of engineers to help with R2's development, DeepSeek was unable to complete a successful training run on the Ascend chips, though it is still working with Huawei to make the model compatible with them.
The R2 delay highlights the key challenge facing China's AI push: Beijing wants domestic AI firms to prove that Chinese chips can match US products, yet today's leading models still rely on Nvidia's software stack and developer tools. Having missed its launch window, DeepSeek may be left chasing releases such as OpenAI's GPT-5. The company says it is working to resolve the hardware problems before the end of the year, and Chinese media reports suggest that despite these setbacks, R2 could be released in the next few weeks.
First moves of DeepSeek V3.1 toward a domestic chip mode
While R2 has been delayed, DeepSeek has pushed ahead with upgrades to its flagship V3 series. V3.1 adds an FP8 precision format, which the company says is geared toward "soon-to-be-released next-generation domestic chips," though no specific chips or manufacturers were named. The FP8 format stores each parameter in just eight bits instead of 16 or 32, halving memory use relative to FP16 and enabling faster throughput on large models with less bandwidth and compute. In addition, like OpenAI's GPT-5, DeepSeek's latest V3.1 model introduces a hybrid switch that toggles output between reasoning and non-reasoning modes, which can save cost.
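The memory savings from lower-precision weight formats follow directly from the bits-per-parameter arithmetic described above. A minimal sketch, assuming a round illustrative parameter count (the ~671B figure below approximates DeepSeek-V3's published size and is not taken from this article):

```python
# Back-of-the-envelope memory footprint of raw model weights at
# different precisions. Parameter count is illustrative only.

def weights_gb(num_params: int, bits_per_param: int) -> float:
    """Gigabytes needed to store the raw weights at a given precision."""
    return num_params * bits_per_param / 8 / 1e9  # bits -> bytes -> GB

params = 671_000_000_000  # ~671B parameters (assumed for illustration)

for fmt, bits in [("FP32", 32), ("FP16", 16), ("FP8", 8)]:
    print(f"{fmt}: {weights_gb(params, bits):,.0f} GB")
# FP8 needs half the memory of FP16 and a quarter of FP32,
# which also halves the bandwidth required to stream the weights.
```

This only counts the weights themselves; activations, optimizer state, and KV caches add further memory on top, so real deployments see somewhat smaller relative savings.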
(Source: Shutterstock)
Updating the model to work with Chinese hardware may signal that DeepSeek is preparing for the upcoming release of better homegrown chips. Another signal comes from the market: Chinese semiconductor stocks jumped after DeepSeek's announcement. Founded in 2016, Beijing chipmaker Cambricon Technologies saw its stock rise as much as 20% and its market value reach roughly $70 billion, while foundry giants Semiconductor Manufacturing International Corporation (SMIC) and Hua Hong Semiconductor gained 10% and 18% respectively. With US export controls tightening, it will be interesting to see which new chips emerge in China's pursuit of AI self-reliance.
Sam Altman cites DeepSeek in OpenAI's open-model pivot
OpenAI recently released its first set of open-weight models since 2019, known as GPT-OSS. Sam Altman told CNBC that Chinese open-source efforts were "a factor" in the decision, noting that "it was clear that if we did not do so, the world was going to be built mostly on Chinese open source models."
Altman's remarks mark a shift since January, when he described DeepSeek's R1 as "impressive" but maintained that OpenAI's scale advantage would keep it ahead. He also shared his views on US export controls: "My instinct is that doesn't work," he said. "You can restrict one thing, but maybe not the right thing ... maybe people build fabs or find other workarounds."
OpenAI's open-weight GPT-OSS joins other open-model families such as Alibaba's Qwen and Meta's Llama. Open-weight models, whose parameters are publicly downloadable, let researchers run them on their own hardware rather than rely on a hosted API. This flexibility matters for scientific use cases where reproducibility, transparency, security, and cost are important. The GPT-OSS models were also released under the Apache 2.0 license, meaning researchers can freely use, modify, and redistribute them, which can accelerate adoption and innovation.
Takeaways
DeepSeek's hardware challenges, V3.1's pivot toward domestic chips, and OpenAI's open-weight release show chip policy, model design, and licensing strategy pulling against one another across the Pacific. China's push for self-reliance is forcing its labs to innovate around domestic silicon, while US rivals still hold the hardware and software keys to the AI kingdom. For scientists, this contest may yield a wider range of models and hardware, but using these tools for frontier-scale research will also require careful benchmarking and cross-platform testing.