Lexar, a leading high-performance memory brand, has introduced the AI Storage Core, a revolutionary new memory solution designed to tackle the biggest data and performance challenges posed by the shift of Artificial Intelligence from the cloud to the Edge.
This new standard specifically targets the exploding market of AI-enabled endpoints, including AI PCs, intelligent vehicles, robotics, and AI gaming, where traditional storage can no longer keep pace with real-time AI workloads. The AI Storage Core combines high-speed performance with a rugged, hot-swappable design, setting a new benchmark in intelligent storage for next-generation Edge AI devices.
The three core innovations built for the AI era
The AI Storage Core is engineered to address limitations in performance, durability, and flexibility, which are critical for any device operating autonomously in the field.
High performance for AI acceleration
To eliminate the severe latency often experienced when running large language models (LLMs) on-device, the AI Storage Core is optimized for efficiency:
- Speed: It utilizes a PCIe 4.0 interface, delivering sequential read/write speeds that significantly surpass traditional memory cards. Based on technical analysis, speeds are estimated to be over 7,000 MB/s read and 6,000 MB/s write.
- Optimization: Lexar has advanced small-block (512B) I/O optimization through features like SLC Boost and Read Cache layers. This specialized firmware is crucial for improving the responsiveness of real-time AI tasks, such as accelerating LLM loading and generative image workflows.
- Capacity: The modules will be offered with capacities up to 4TB.
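To put the quoted sequential-read figure in perspective, a back-of-envelope sketch of model load times at different sustained read speeds. The model size below is a hypothetical example for illustration, not a Lexar specification:

```python
# Rough load-time estimate for streaming an on-device model from storage.
# Speeds are the article's quoted figures; the model size is hypothetical.

def load_time_seconds(model_size_gb: float, read_mb_per_s: float) -> float:
    """Time to stream a model of the given size at a sustained read speed."""
    return (model_size_gb * 1000) / read_mb_per_s

# A 7B-parameter model quantized to 4 bits occupies roughly 4 GB on disk.
print(f"{load_time_seconds(4, 7000):.2f} s")  # PCIe 4.0-class drive -> 0.57 s
print(f"{load_time_seconds(4, 1000):.2f} s")  # typical memory card  -> 4.00 s
```

Even this simplified arithmetic shows why interface speed dominates the cold-start experience of on-device AI: the same model loads several times faster from a PCIe 4.0-class module than from a conventional memory card.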
High reliability for harsh environments
Edge AI devices frequently operate in demanding, non-climate-controlled settings. The AI Storage Core is built to endure these conditions:
- Construction: The memory is encased in integrated protective packaging that makes it dustproof, waterproof, shock-resistant, and radiation-resistant.
- Temperature Tolerance: Selected upcoming models are designed for extreme conditions, supporting a wide temperature operation range of –40°C to 85°C, which is essential for autonomous driving and industrial robotics.
High flexibility for cross-device AI collaboration
The form factor introduces a new level of utility and maintenance efficiency for enterprise customers:
- Form Factor: It is a Micro-SSD with a rugged casing (approx. 22.9 mm × 28.5 mm), resembling a compact M.2-2230 SSD but with enhanced protection.
- Hot-Swappable: The design allows users to insert or remove the device while the system is actively running. This is a feature typically reserved for much slower memory card formats or complex enterprise storage arrays.
- PCIe Boot Support: The module is bootable, meaning users can launch the entire operating system, applications, and data directly from the Core, enabling true system portability.

Why this innovation matters: Lexar vs. traditional storage
The AI Storage Core’s true significance lies in its combined feature set, which addresses the three major deployment limitations of Edge AI hardware: bottlenecked performance, environmental fragility, and complexity of deployment.
Comparison: AI storage core vs. market standards
| Feature | Lexar AI Storage Core | Traditional M.2 NVMe SSDs | Industrial/Consumer Memory Cards (SD/CFexpress) |
| --- | --- | --- | --- |
| Performance Focus | AI-Optimized: Specialized firmware (SLC Boost, Read Cache) for small-block I/O (crucial for LLM loading). | Optimized for sequential R/W and general PC performance. | Optimized for sequential R/W (large photo/video files); limited random R/W. |
| Ruggedness | Industrial Grade: Dustproof, waterproof, shock-resistant, wide-temperature (–40°C to 85°C). | Limited consumer-grade temperature and shock tolerance. | Basic weather resistance, but not industrial temp/shock rated. |
| Flexibility | Hot-Swappable & PCIe Bootable. | Cold-Swappable: Requires system shutdown for removal/insertion. | Hot-swappable, but limited by lower capacity and significantly slower speeds. |

The critical breakthroughs
Eliminating the performance bottleneck for LLMs
When running large AI models, the device constantly swaps fragmented data between RAM and permanent storage. Traditional storage creates an I/O bottleneck here. The Lexar Core’s small-block I/O optimization is specifically designed to minimize this congestion, ensuring data flows quickly enough for the processor to deliver real-time AI responses without stuttering.
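The access pattern described above can be sketched in a few lines: many small (512 B) reads at scattered offsets, as when model weights are demand-loaded during inference. This POSIX-only sketch uses a temporary file as a stand-in for a model shard; the file size and read count are illustrative, not tied to any Lexar product:

```python
# Sketch of the small-block random-read pattern that the article says
# the Core's firmware optimizes for. Sizes and counts are illustrative.
import os
import random
import tempfile
import time

BLOCK = 512                       # the 512 B block size named in the article
FILE_SIZE = 8 * 1024 * 1024       # 8 MiB stand-in for a model shard

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(FILE_SIZE))
    path = f.name

fd = os.open(path, os.O_RDONLY)
random.seed(0)
offsets = [random.randrange(0, FILE_SIZE - BLOCK) for _ in range(10_000)]

start = time.perf_counter()
total = sum(len(os.pread(fd, BLOCK, off)) for off in offsets)  # scattered reads
elapsed = time.perf_counter() - start

os.close(fd)
os.unlink(path)
print(f"read {total} bytes in {elapsed * 1000:.1f} ms")
```

On a cold cache, each scattered read costs one storage round trip, so aggregate latency is dominated by random-read performance rather than the headline sequential throughput, which is exactly the gap the firmware-level small-block optimization targets.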
Enabling true AI portability
The combination of the hot-swappable design and PCIe boot support is revolutionary for enterprise and industrial use. A company can now store the entire AI’s “identity” (OS, trained models, security logs) on a Core module. This allows technicians to:
- Perform maintenance and instant upgrades without interrupting the entire system’s operation (zero downtime).
- Quickly transfer a fully configured AI system from one physical robot or AI PC to another simply by swapping the module.
Purpose-built for the edge
The AI Storage Core is tailored to solve specific, high-demand challenges across five key segments:
- AI Robotics: The compact design fits space-constrained robots, while the ability to swap modules easily enables fast intelligence, identity, and security upgrades on factory floors or in logistics scenarios.
- AI PC: High speed and capacity accelerate LLM workflows and generative tasks, with hot-swap support enabling full mobile-workstation portability.
- AI Gaming: High IOPS and fast random-read performance reduce load times and stutter, supporting high-frame-rate rendering and real-time AI interactions.
- AI Camera: Sustained performance supports continuous 4K/8K video capture and real-time AI processing, with the shock-resistant construction being ideal for outdoor imaging.
- AI Driving: Capable of ingesting multi-sensor data streams from cameras, radar, and LiDAR, with wide-temperature and shock-resistant models ensuring stable operation in extreme automotive conditions.
Conclusion: A new benchmark for intelligent storage
The Lexar AI Storage Core reflects a clear understanding that, as AI models grow larger and more complex, the bottleneck shifts from pure processing power to data accessibility. By delivering a solution that matches the extreme performance, rugged reliability, and practical flexibility demanded by next-generation endpoints, Lexar is helping to accelerate the global adoption of AI Storage for Next-Generation Edge AI Devices.
