Samsung unveils next-gen memory solution for ultra-large AI at MemCon 2023
  • Jung So-yeon
  • Published 2023.03.31 12:59

On March 28, Choi Jin-hyuk, head of the Memory Research Institute at Samsung's US subsidiary, gave a keynote speech on "Memory Innovation in the Data-Centered Computing Era" at MemCon 2023, held in California, USA. / Courtesy of Samsung Electronics

Samsung Electronics has unveiled a next-generation memory solution that can be applied to ultra-large AI at the global AI semiconductor conference, MemCon 2023.

On March 28, Choi Jin-hyuk, head of the Memory Research Institute at Samsung's US subsidiary, gave a keynote speech on "Memory Innovation in the Data-Centered Computing Era" at MemCon 2023, held in California, USA. This year's MemCon, held for the first time, focuses on AI memory solutions; Samsung participated as a founding member alongside global IT companies such as Google and Microsoft, as well as memory, device, and server companies.

Choi introduced Samsung's next-generation memory technologies, which can relieve the bottleneck between processors and memory. These include "HBM-PIM (Processing-in-Memory)," which integrates computational functionality into high-bandwidth memory; "PNM (Processing Near Memory)," which places computational functionality next to memory; and "CXL (Compute Express Link) DRAM," which can expand system memory capacity to the terabyte (TB) scale.

Samsung emphasized that "as ultra-large AI models like ChatGPT gain traction, next-generation memory technologies are emerging," and introduced HBM-PIM and CXL-PNM as products that, when applied to AI applications, can dramatically improve GPU accelerator performance compared to existing solutions.

Choi explained that "large AI models like ChatGPT are estimated to spend up to 80% of their processing time stalled on memory bottlenecks, causing problems such as delayed sentence generation," and that "applying HBM-PIM technology to large AI models improves generation performance by about 3.4 times compared to GPU accelerators with existing HBM."

The CXL-based PNM solution can increase memory capacity to up to four times that of existing GPU accelerators through the easily expandable CXL interface, which also reduces data movement and more than doubles AI model loading speed.

The next-generation storage device, the "2nd Generation SmartSSD," can reduce data processing time and energy consumption thanks to significantly improved performance over the first generation. Additionally, the Memory-Semantic SSD, with random read and response speeds up to 20 times faster than regular SSDs, is suited to AI and machine learning workloads in data centers.

Recently, the semiconductor industry has been focusing on high-performance AI semiconductors. Use cases for future technologies such as autonomous driving, humanoid robots, speech recognition, and the metaverse are increasing, and the data flows they generate are becoming more complex.


