    Tunisian Post
    Technology

    Samsung targets first quarter HBM4 deliveries for AI boom

    February 9, 2026
    Facebook WhatsApp Twitter Pinterest LinkedIn Telegram Tumblr Email Reddit VKontakte

    SEOUL: Samsung Electronics said it is on track to begin delivering its next-generation HBM4 high-bandwidth memory products in the first quarter of 2026. The company said the HBM4 lineup will include products running at 11.7 gigabits per second (Gbps) per pin, as it expands sales of memory used in artificial intelligence servers and accelerators. Samsung did not name customers for the initial HBM4 deliveries, and it did not disclose shipment volumes in its earnings materials.

    [Image caption: Samsung says HBM4 deliveries start in Q1 2026 as 11.7 Gbps memory targets AI servers worldwide.]

    In its fourth-quarter and full-year 2025 results, Samsung said its memory business posted record highs in quarterly revenue and operating profit, supported by higher sales of high-value products including HBM, server DDR5 and enterprise solid-state drives. The company said limited supply availability remained a factor even as demand for AI computing continued to lift consumption of advanced memory and storage used in data centers.

    High-bandwidth memory is a vertically stacked DRAM technology designed to increase data throughput compared with conventional DRAM, and HBM4 is the newest generation following HBM3E. In an investor presentation accompanying the earnings release, Samsung said it plans to start delivering HBM4 “mass products,” including an 11.7 Gbps version, and cited “timely shipment” of HBM4 as part of its near-term outlook for AI-related products.
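    To put the 11.7 Gbps per-pin figure in context, peak bandwidth for a single stack is the pin speed multiplied by the interface width. The sketch below assumes the 2048-bit per-stack interface defined in the JEDEC HBM4 standard; Samsung's materials quote only the per-pin speed, so the interface width is an assumption, not a figure from the source.

    ```python
    # Rough peak-bandwidth estimate for one HBM4 stack.
    # Assumption: 2048-bit per-stack interface (JEDEC HBM4);
    # only the 11.7 Gbps per-pin speed comes from Samsung's materials.
    def hbm_stack_bandwidth_tbps(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
        """Peak bandwidth of a single HBM stack in terabytes per second."""
        bits_per_second = pin_speed_gbps * 1e9 * bus_width_bits
        return bits_per_second / 8 / 1e12  # bits -> bytes -> terabytes

    print(f"{hbm_stack_bandwidth_tbps(11.7):.2f} TB/s per stack")  # about 3 TB/s
    ```

    Under that assumption, a single 11.7 Gbps HBM4 stack would peak at roughly 3 terabytes per second.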

    Nvidia, the largest supplier of AI data center accelerators, has introduced its Rubin platform, which it says uses HBM4 across multiple system configurations. On Nvidia’s product specifications page for the Vera Rubin NVL72 rack-scale system, the company lists 20.7 terabytes of HBM4 with 1,580 terabytes per second of bandwidth, and 288 gigabytes of HBM4 with 22 terabytes per second of bandwidth for a single Rubin GPU, noting the figures are preliminary and subject to change.
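    The rack-level figures Nvidia lists are consistent with scaling the single-GPU numbers by 72 (the GPU count implied by the "NVL72" name); the quick check below uses only the figures quoted above.

    ```python
    # Sanity check: do Nvidia's per-GPU HBM4 figures scale to the NVL72 rack totals?
    gpus = 72            # "NVL72" implies 72 GPUs per rack
    gb_per_gpu = 288     # GB of HBM4 per Rubin GPU (Nvidia spec page)
    tbps_per_gpu = 22    # TB/s of HBM4 bandwidth per Rubin GPU

    rack_capacity_tb = gpus * gb_per_gpu / 1000   # listed as 20.7 TB
    rack_bandwidth_tbps = gpus * tbps_per_gpu     # listed as 1,580 TB/s

    print(rack_capacity_tb, rack_bandwidth_tbps)  # → 20.736 1584
    ```

    Both products land within rounding of the listed rack totals (20.7 TB and 1,580 TB/s), which Nvidia notes are preliminary and subject to change.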

    Rubin platform memory requirements

    Samsung’s results statement also pointed to broader work across advanced semiconductor manufacturing and packaging linked to AI computing. It said its foundry business commenced mass production of first-generation 2-nanometer products and began shipments of 4-nanometer HBM base-die products, components used in the logic layer of high-bandwidth memory stacks. Samsung said it plans to provide optimized solutions through integration of logic, memory and advanced packaging technologies.

    Other major memory makers have also published timelines for their HBM4 readiness. SK hynix said in September 2025 that it had completed HBM4 development and readied a mass-production system. Micron said in a December 2025 investor presentation that its HBM4, with speeds over 11 Gbps, is on track to ramp with high yields in the second calendar quarter of 2026, consistent with customers' platform ramp plans.

    Competing HBM4 road maps

    In describing its own HBM4 program, Micron said its HBM4 uses advanced CMOS and metallization technologies on the base logic die and DRAM dies, designed and manufactured in-house, and pointed to packaging and test capability as critical to performance and power targets. SK hynix has described HBM4 as part of a generational progression in stacked memory built for ultra-high performance AI, where bandwidth and power efficiency are central requirements for data center operation.

    Samsung’s earnings materials did not link its HBM4 delivery schedule to any specific AI processor program or customer deployment. Nvidia’s Rubin announcements and published specifications do not identify HBM4 suppliers, and Nvidia has not disclosed vendor allocations for the HBM4 used in Rubin systems. Samsung’s confirmed timeline, as stated in its results release, is that HBM4 deliveries are expected to begin within the first quarter of 2026. – By Content Syndication Services.

    © 2026 Tunisian Post | All Rights Reserved