- Researchers found that Apache Parquet had a maximum-severity flaw
- The flaw allows threat actors to execute arbitrary code
- A patch has been released and users are urged to apply it
Apache Parquet, a columnar storage file format, carried a maximum-severity vulnerability that allowed threat actors to execute arbitrary code on affected endpoints.
Parquet is an optimized columnar storage file format designed for efficient data storage and processing, commonly used in big data and analytics workloads. Amazon, Google, Microsoft, and Meta are just some of the large companies that use it.
The flaw, discovered on April 1, 2025 by Amazon security researcher Keyi Li, is now tracked as CVE-2025-30065 and carries a maximum severity score of 10/10 (critical).
Patch and mitigations
“Schema parsing in the parquet-avro module of Apache Parquet 1.15.0 and previous versions allows bad actors to execute arbitrary code,” reads a brief description on the NVD page. “Users are recommended to upgrade to version 1.15.1, which fixes the issue.”
According to reports, the problem stems from the deserialization of untrusted data, which allows threat actors to gain control of target systems through specially crafted Parquet files.
The caveat here is that the victim must be tricked into importing the files, which, the researchers suggest, means the threat is not as imminent as the 10/10 score might imply.
Those who cannot update their Apache Parquet instances to version 1.15.1 right away should avoid untrusted Parquet files, or at least analyze them carefully before acting on them.
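For teams that can update, the fix amounts to bumping the affected parquet-avro dependency to 1.15.1. A minimal sketch of what that looks like in a Maven build (the group and artifact IDs are the standard published coordinates for the module named in the advisory; adapt for Gradle or other build tools as needed):

```xml
<!-- pom.xml fragment: pin parquet-avro to the patched release -->
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-avro</artifactId>
    <!-- 1.15.1 resolves CVE-2025-30065; versions <= 1.15.0 are vulnerable -->
    <version>1.15.1</version>
</dependency>
```

Running a dependency report (e.g. `mvn dependency:tree`) can help confirm that no transitive dependency still pulls in an older, vulnerable version of the module.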
In addition, IT teams should monitor and log their Parquet processing systems more closely for the time being.
At the time of publication, there was no evidence of exploitation in the wild, although hackers typically start scanning for vulnerable endpoints once a patch is released, betting that many organizations will not apply it in time.
Via BleepingComputer