Meet TurboQuant: The Korean Professor's Tech Behind Semiconductor Stock Plunge

KAIST Prof. Han In-su Briefs on TurboQuant · Data Compression Cuts Memory Usage · Broad Applications from On-Device AI to Search Algorithms

News | By Seo Ji-hye
Seoul Economic Daily Technology News from South Korea

Google's recently unveiled artificial intelligence memory optimization technology, "TurboQuant," is shaking the semiconductor market. In simple terms, TurboQuant dramatically reduces the memory that large language models (LLMs) such as ChatGPT need when processing longer and more complex queries. After Google's announcement, the memory semiconductor market shuddered: on the 30th (local time), the semiconductor index on the New York Stock Exchange plunged 4.23%, while shares of Micron (down 9.88%) and SanDisk (down 7.04%) also fell sharply amid concerns that demand for memory semiconductors could decline.

The new technology that shook the global semiconductor market was developed based on two papers written by Han In-su, a professor in the School of Electrical Engineering at KAIST, South Korea's top science and technology university. "The more complex and longer the problem to solve, the more memory is needed, and costs and response times increase accordingly," Han said during an online briefing held at KAIST on the 30th. "TurboQuant can be applied directly to a user's existing model to reduce memory while minimizing performance degradation." Below is a Q&A based on Professor Han's explanations.
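The article does not describe TurboQuant's internals, but the kind of memory saving Professor Han describes is typically achieved by quantization: storing model activations or cache tensors in fewer bits. The following is a minimal illustrative sketch of plain uniform 8-bit quantization, not the actual TurboQuant algorithm; the tensor shape and function names are hypothetical.

```python
import numpy as np

def quantize_8bit(x: np.ndarray):
    """Uniformly map float32 values onto uint8, keeping scale/offset for reconstruction."""
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    q = np.round((x - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize_8bit(q: np.ndarray, scale: float, lo: float) -> np.ndarray:
    """Approximately recover the original float32 values."""
    return q.astype(np.float32) * scale + lo

# A stand-in for an LLM cache tensor: float32 costs 4 bytes per value.
kv = np.random.randn(1024, 128).astype(np.float32)
q, scale, lo = quantize_8bit(kv)

print(kv.nbytes // q.nbytes)  # memory shrinks 4x (uint8 vs float32)
err = np.abs(dequantize_8bit(q, scale, lo) - kv).max()
print(err)  # reconstruction error stays below one quantization step
```

This toy version trades a small, bounded reconstruction error for a 4x memory reduction; Han's point is that a well-designed scheme keeps that accuracy loss small enough to apply directly to an existing model.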

Q. What is TurboQuant?


AI-translated from Korean. Quotes from foreign sources are based on Korean-language reports and may not reflect exact original wording.