China’s Own GPU Industry Is Slowly Awakening

“7G” is an abbreviation that sounds almost identical to the Chinese word for “miracle” (qiji). Whether this is a lucky piece of marketing or a genuine technological prophecy remains to be seen. What Lisuan Technology is presenting with the 7G106, internally codenamed G100, is nothing less than the first serious attempt to step out of Nvidia and AMD’s shadow: no licensing agreements, no crutches built on Western intellectual property. This is a GPU designed from scratch, manufactured on a 6 nm DUV process in a country that is only beginning to work its way out from under Western export controls.

The 7G106’s TrueGPU architecture is a monolithic, all-in-one design. It relies on SIMD engines, delivers a peak of 24 TFLOP/s of FP32 compute, and pairs a 192-bit memory interface with 12 GB of GDDR6 memory. That is no world record, but it is a pragmatic choice for its target audience: the Chinese mid-range market, where ray tracing is not fetishized. The 192 texture mapping units (TMUs) and 96 raster operations units (ROPs) point to a design meant to be functional without depending on TSMC or getting tangled in the licensing complexities of the HDMI Forum. Speaking of HDMI: it is absent. Instead there are four DisplayPort 1.4 outputs with DSC 1.2b support, a friendly reminder that Lisuan refuses to pay into the Western licensing game. HDMI costs money, and if savings must be made, they start with symbolic politics.
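Lisuan has not published the memory clock, so the bus width alone does not fix the bandwidth. As a rough, hedged sketch: assuming a common GDDR6 per-pin data rate of 14 Gbit/s (an assumption, not a confirmed spec), the 192-bit interface would land in familiar mid-range territory:

```python
# Back-of-the-envelope memory bandwidth for a 192-bit GDDR6 bus.
# The per-pin data rate is NOT confirmed by Lisuan; 14 Gbit/s is simply
# a common GDDR6 speed, used here as an assumption.
BUS_WIDTH_BITS = 192
DATA_RATE_GBIT_S = 14  # assumed Gbit/s per pin

bandwidth_gb_s = BUS_WIDTH_BITS * DATA_RATE_GBIT_S / 8  # bits -> bytes
print(f"~{bandwidth_gb_s:.0f} GB/s")  # ~336 GB/s at the assumed data rate
```

At 14 Gbit/s that works out to roughly 336 GB/s, in the same ballpark as other 192-bit mid-range cards; faster or slower GDDR6 would shift the figure proportionally.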

What the GPU lacks technically (such as ray tracing or DX12 Ultimate support), it compensates for with a complete set of virtualization features. SR-IOV support with up to 16 virtual functions enables enterprise-grade GPU sharing. Clearly, the focus isn’t just on gamers, but also on state data centers and cloud service providers with patriotic assurances.
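On Linux, SR-IOV devices are conventionally carved into virtual functions through a generic sysfs interface; a minimal sketch of that mechanism is below. The helper function and the PCI address in the usage comment are hypothetical, and whether Lisuan’s driver wires into this standard interface is unconfirmed:

```python
# Sketch of Linux's generic SR-IOV sysfs interface: a physical function
# (PF) advertises how many virtual functions (VFs) it supports, and
# writing to sriov_numvfs splits it into that many VFs.
from pathlib import Path

def enable_vfs(pf_sysfs: Path, requested: int) -> int:
    """Enable `requested` VFs on a PF, capped at what the device advertises."""
    total = int((pf_sysfs / "sriov_totalvfs").read_text())
    n = min(requested, total)  # never ask for more VFs than the hardware offers
    (pf_sysfs / "sriov_numvfs").write_text(str(n))
    return n

# Hypothetical usage on a host with the card installed (address made up):
# enable_vfs(Path("/sys/bus/pci/devices/0000:03:00.0"), 16)
```

Each resulting virtual function appears as its own PCI device that can be passed through to a guest, which is what makes the quoted figure of 16 tenants per card plausible for cloud deployments.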

Early Geekbench tests were sobering: 256 MB of visible VRAM, a 300 MHz clock, and just 15,000 points. The performance was on par with a GTX 660 Ti, a museum piece by today’s standards. But as is often the case with A0 steppings: if it works at all, it works poorly at first, and that is perfectly normal. Things became more exciting after a recent leak. Suddenly the specs pointed to 48 compute units, a 2 GHz clock, 12 GB of VRAM, and an OpenCL score of 111,000, roughly equivalent to an RTX 4060. This isn’t a revolution, but a demonstration of strength: we can do it. Not perfectly yet, but we’re getting there. If this is real, and not just a benchmark run pumped full of steroids, Lisuan could truly become the first fully independent Chinese GPU manufacturer.
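The leaked figures are at least internally consistent with the quoted 24 TFLOP/s peak. Assuming 128 FP32 lanes per compute unit (an assumption; Lisuan has not disclosed the CU layout) and counting a fused multiply-add as two FLOPs:

```python
# Sanity-check the leaked 48 CU / 2 GHz figures against the quoted
# 24 TFLOP/s FP32 peak. Lanes-per-CU is an assumption (128, as in
# several modern GPU designs), not a confirmed Lisuan spec.
COMPUTE_UNITS = 48
LANES_PER_CU = 128        # assumed FP32 lanes per compute unit
CLOCK_GHZ = 2.0
FLOPS_PER_LANE_CLK = 2    # one fused multiply-add = 2 FLOPs

tflops = COMPUTE_UNITS * LANES_PER_CU * FLOPS_PER_LANE_CLK * CLOCK_GHZ / 1000
print(f"~{tflops:.1f} TFLOP/s FP32")  # ~24.6, consistent with the spec sheet
```

That the leak and the spec sheet line up under such a conventional assumption suggests the numbers describe the same silicon rather than two unrelated rumors.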

But a GPU is not just a benchmark table—it’s an ecosystem. And here lies a major gap: there are no developer kits for studios, no driver support comparable to Nvidia’s Game Ready program, no real expertise with game engines, no DirectML, no AI integration, and no alternative to DLSS. Hardware without software is like an aircraft without an engine: impressive on paper, but useless in the air.

Lisuan is walking a fine line between political symbol and functional product. On the one hand, the 7G106 is a technological statement, almost a geopolitical declaration against Western tech hegemony. On the other hand, its current state strongly resembles the phase when AMD’s Vega 10 was circulating without proper driver support—technically impressive but detached from reality. In theory, the GPU supports PCIe 4.0, can decode 8K video, encode AV1, and support up to 16 virtual machines in vGPU mode. It all sounds great, but until clock speeds, power envelopes, and working drivers are in place, it’s all just theory. The PCB has sometimes shown an 8-pin, other times a 16-pin power connector. Which one is real? No one knows. How much will it cost? No one knows that either. When will it launch? Maybe next year. Maybe never.

Lisuan Tech has laid the first stone, and it is no pebble but a 6 nm monolith with serious ambition. The 7G106 is not an RTX or Radeon killer; it is a wake-up call to the Western GPU duopoly: China no longer wants merely to copy. It wants to play. Even if it takes one or two more years of practice to get there. If Lisuan can overcome its software issues and manage reliable mass production, that won’t be a miracle; it will be the first step toward self-assertion in a market long dominated by Nvidia and flanked by AMD, both deeply rooted in American tech. Whether that will be enough won’t be decided by Geekbench, but by gamers with wallets, ambition, and a curiosity for alternatives.
