X Space Recap - IoT, Telemetry, Sensor & Machine Data w/ Filecoin Foundation, IoTeX, WeatherXM, and DIMO

"NOAA allocates roughly $1.2 billion each year to gather weather data. WeatherXM can deliver comparable coverage for about 1% of that cost."

With this comparison, WeatherXM Co-Founder Manos Nikiforakis set the tone for our X Space, “IoT, Telemetry, Sensor & Machine Data”, hosted alongside Filecoin Foundation.

For July’s Space, we brought together speakers from Filecoin Foundation, IoTeX, WeatherXM, and DIMO.

Over 60 minutes the group examined how decentralized physical infrastructure networks (DePIN) are reshaping data markets. They walked through proof chains that begin at the sensor, incentive models that discourage low-quality deployments, and programs that reward EV owners for sharing telemetry.

DePIN is no longer hypothetical: networks are operating, real users are earning rewards, and real enterprises are consuming the data.

The excerpts below highlight the key moments and insights from the conversation.

Turning vehicle telemetry into real savings

Drivers who join DIMO first mint a "vehicle identity", a smart-contract wallet that stores their car's data and keeps them in charge of access.

From there they can release selected metrics to approved parties. A leading example is usage-based insurance: share only charging-session records and receive an annual rebate.
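To make the selective-sharing idea concrete, here is a minimal sketch of a scoped data grant. The types, field names, and scopes are hypothetical illustrations, not DIMO's actual schema.

```ts
// Hypothetical shapes for illustration; DIMO's real permission model differs.
type MetricScope = "charging_sessions" | "location" | "odometer";

interface DataGrant {
  vehicleId: string;      // the vehicle identity (smart-contract wallet)
  grantee: string;        // e.g. the insurer's address
  scopes: MetricScope[];  // only these metrics may be read
  expiresAt: number;      // unix seconds
}

// A consumer request is honored only if it fits an unexpired grant.
function isAuthorized(grant: DataGrant, grantee: string, scope: MetricScope, now: number): boolean {
  return grant.grantee === grantee && grant.scopes.includes(scope) && now < grant.expiresAt;
}

// Example: an insurer may read charging sessions but nothing else.
const grant: DataGrant = {
  vehicleId: "0xVehicle...",
  grantee: "0xInsurer...",
  scopes: ["charging_sessions"],
  expiresAt: Math.floor(Date.now() / 1000) + 365 * 24 * 3600,
};
const now = Math.floor(Date.now() / 1000);
console.log(isAuthorized(grant, "0xInsurer...", "charging_sessions", now)); // true
console.log(isAuthorized(grant, "0xInsurer...", "location", now));          // false
```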

As DIMO's CTO Yevgeny Khessin explained, the program with insurer Royal pays "$200 back per year" for that data.

Each transfer is cryptographically signed, so ownership stays with the driver while the insurer receives verifiable telemetry.
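A minimal sketch of that sign-and-verify pattern, using ethers.js with an invented payload shape:

```ts
import { Wallet, verifyMessage } from "ethers";

async function main() {
  const driver = Wallet.createRandom(); // stands in for the vehicle identity's key

  const payload = JSON.stringify({
    vehicleId: driver.address,
    metric: "charging_session",
    kwh: 42.5,
    endedAt: 1700000000,
  });

  // The driver signs the telemetry before releasing it.
  const signature = await driver.signMessage(payload);

  // The insurer recovers the signer and checks it matches the vehicle identity.
  const signer = verifyMessage(payload, signature);
  console.log("verified:", signer === driver.address); // true
}

main();
```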

This demonstrates how DePIN turns dormant signals into economic value—and why the underlying infrastructure matters.

The ledger as the "handshake" layer

The role of the ledger becomes clear when you consider interoperability requirements. It serves as the shared source of truth for composability between systems. New patterns emerge, such as vehicle-to-vehicle payments.

In a Web2 world, that would have required coordination across Visa and multiple automakers. On a blockchain, the payment is straightforward because the parties meet on the same ledger.
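As a sketch, assuming both vehicles hold ordinary wallets on the same chain, the payment reduces to a single transfer; the RPC endpoint, keys, and price here are placeholders.

```ts
import { JsonRpcProvider, Wallet, parseEther } from "ethers";

async function payForCharge(rpcUrl: string, payerKey: string, receivingCar: string) {
  const provider = new JsonRpcProvider(rpcUrl);
  const payingCar = new Wallet(payerKey, provider);

  // Both vehicles resolve to addresses on the same ledger, so no card
  // network or automaker back-office sits between them.
  const tx = await payingCar.sendTransaction({
    to: receivingCar,          // the other vehicle's wallet address
    value: parseEther("0.01"), // price of, say, a charging top-up
  });
  await tx.wait();             // settlement is simply ledger confirmation
}
```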

The same "handshake" applies to data sharing. Yevgeny went on to highlight that using the ledger as the agreement layer, with private-key permissions, creates a system where only the owner can authorise access. This addresses recent controversies over car data being shared without clear consent.

Khessin summarized the benefits as ownership, interoperability, and verifiability. Looking ahead, machines will need to verify one another before acting on any message. A public ledger provides the audit trail for those decisions.

"Using a ledger as the source of truth for interoperability ... allows you to enable use cases that just weren't possible before."

This sets up the shift from isolated "data markets" to the protocol plumbing: identity, permissions, and payments negotiated on a neutral, verifiable layer.

Building the infrastructure: IoTeX's physical intelligence ecosystem

IoTeX positions itself as the connector between real-world data and AI systems through "realms". These are specialized subnets that allow DePIN projects to reward users with multiple tokens: their native token, realm-specific tokens, and IoTeX tokens.

Example: a mobility realm can reward a driver in both the project's token and an IoTeX realm token for verified trips, with proofs carried from device to contract.
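A toy calculation of that multi-token split might look like the following; the reward rates are invented for illustration.

```ts
// Invented rates; real realm economics are set by each project and IoTeX.
interface TripReward {
  projectToken: number; // the DePIN project's native token
  realmToken: number;   // the realm-specific token
  iotx: number;         // IoTeX token
}

function rewardForTrip(verifiedKm: number): TripReward {
  return {
    projectToken: verifiedKm * 0.5,
    realmToken: verifiedKm * 0.1,
    iotx: verifiedKm * 0.05,
  };
}

console.log(rewardForTrip(12)); // { projectToken: 6, realmToken: 1.2, iotx: 0.6 }
```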

The technical challenge is ensuring data integrity throughout the entire lifecycle: from collection at the device, through off-chain processing, to smart contract verification on-chain. Each step requires cryptographic proofs that the data hasn't been tampered with.
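One common way to achieve this, sketched below, is hash-chaining: each stage commits to the previous stage's hash, so tampering anywhere breaks the final on-chain commitment. Stage names and payloads are illustrative, not IoTeX's actual pipeline.

```ts
import { createHash } from "crypto";

const sha256 = (data: string) => createHash("sha256").update(data).digest("hex");

// 1. Device: the raw reading is hashed at the source.
const reading = JSON.stringify({ deviceId: "sensor-42", value: 23.7, ts: 1700000000 });
const deviceHash = sha256(reading);

// 2. Off-chain processing: the output commits to the input hash.
const processed = JSON.stringify({ avg: 23.7, window: "1h" });
const processedHash = sha256(deviceHash + processed);

// 3. On-chain: a contract need only store the final commitment; anyone
//    holding the intermediate data can recompute and audit the chain.
const onChainCommitment = sha256(processedHash);
console.log(onChainCommitment);
```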

IoTeX tracks this entire chain to ensure enterprise customers can trust the data they consume. This becomes critical as AI systems increasingly depend on real-world inputs for training and decision-making.

Incentives as quality control: finding shadows, fixing stations

WeatherXM demonstrates how economic incentives can drive data quality at scale by tying rewards directly to measurable performance metrics.

They start at the device level. Stations sign each sensor payload with secure elements embedded during manufacturing. Location verification uses hardened GPS modules paired to the main controller. This makes spoofing significantly more difficult and lets the network trace readings back to specific units and coordinates.
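A hedged sketch of that device-level verification, assuming an Ed25519 key in the secure element and a hypothetical registry mapping station IDs to the public keys captured at manufacture:

```ts
import { generateKeyPairSync, sign, verify } from "crypto";

// Stand-in for the key burned into the station's secure element.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Network-side registry: station id -> public key captured at manufacture.
const registry = new Map([["wxm-station-7", publicKey]]);

// The station signs each payload, including its paired GPS coordinates.
const payload = Buffer.from(
  JSON.stringify({ stationId: "wxm-station-7", tempC: 18.2, lat: 37.98, lon: 23.73 })
);
const signature = sign(null, payload, privateKey);

// A verifier can trace the reading back to a specific unit and location.
const registered = registry.get("wxm-station-7")!;
console.log("authentic:", verify(null, payload, registered, signature)); // true
```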

Deployment matters too. Operators submit site photos for human review, with machine vision assistance planned for the future.

Then comes continuous quality assurance. WeatherXM analyzes time-series data to detect poor placements. Repeating shadows signal nearby obstacles, not moving cloud cover. Similar pattern recognition applies to wind and other environmental signals.
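A toy version of that check: a fixed obstacle depresses solar readings at the same hour every day, while clouds do not repeat. The thresholds below are invented, not WeatherXM's.

```ts
// hourlyByDay: one array of 24 solar readings per day.
function shadowHours(hourlyByDay: number[][]): number[] {
  const days = hourlyByDay.length;
  const flagged: number[] = [];

  for (let h = 1; h < 23; h++) {
    let consistentDip = 0;
    for (const day of hourlyByDay) {
      const expected = (day[h - 1] + day[h + 1]) / 2; // neighbors' average
      if (expected > 0 && day[h] < 0.6 * expected) consistentDip++;
    }
    // A dip recurring on almost every day points at a placement issue.
    if (consistentDip / days > 0.8) flagged.push(h);
  }
  return flagged;
}
```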

The incentive mechanism is direct: poor deployments earn reduced rewards. The system writes on-chain attestations of station quality over time, making performance transparent to participants and customers.

“We analyze time-series data to identify repeating shadows. That signals a placement issue. Poorly deployed stations receive fewer rewards, and we record quality on-chain.”
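Tying this together, a minimal sketch of scaling rewards by a quality score and shaping the on-chain attestation; the scoring scale and record shape are assumptions, not WeatherXM's schema.

```ts
interface QualityAttestation {
  stationId: string;
  epoch: number;
  score: number; // 0..1, from the QA checks above
}

// Poor deployments earn reduced rewards; quality scales the payout linearly.
function rewardMultiplier(score: number): number {
  return Math.max(0, Math.min(1, score));
}

const attestation: QualityAttestation = { stationId: "wxm-station-7", epoch: 1023, score: 0.62 };
const payout = 10 * rewardMultiplier(attestation.score); // base reward of 10, scaled
console.log(payout); // 6.2
```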

This approach addresses a persistent problem with legacy sensor networks. Many government and corporate deployments struggle with maintenance costs and quality control. Making quality transparent filters good stations from bad without requiring a central gatekeeper.

WeatherXM extends this concept to forecast accuracy, tracking performance across different providers and planning to move those metrics on-chain as well.

AI agents and the future of automated data markets

Manos described WeatherXM's implementation of an MCP client that allows AI agents to request weather data without navigating traditional API documentation.
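Because MCP is JSON-RPC 2.0 under the hood, the agent's request is a single tools/call message rather than a bespoke REST integration. The tool name and argument schema below are hypothetical.

```ts
// What an agent's weather lookup might look like on the wire.
const mcpRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_weather",                   // hypothetical WeatherXM tool
    arguments: { lat: 37.98, lon: 23.73 }, // hypothetical schema
  },
};
console.log(JSON.stringify(mcpRequest));
```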

More significantly, they're exploring x402 protocols that enable agents to make autonomous payments. Instead of requiring human intervention with traditional payment methods, an AI agent with a crypto wallet can pay for historical or real-time weather data on-demand.
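A hedged sketch of that flow: request, receive HTTP 402 with payment requirements, settle from the agent's wallet, retry with a payment proof attached. Header and field names follow the general x402 shape but should be checked against the live spec, and the wallet helper is hypothetical.

```ts
async function fetchPaidWeather(url: string): Promise<unknown> {
  const first = await fetch(url);
  if (first.status !== 402) return first.json(); // no payment required

  const requirements = await first.json(); // price, asset, pay-to address

  // Settle autonomously from the agent's crypto wallet, no human in the loop.
  const proof = await payFromAgentWallet(requirements);

  const retry = await fetch(url, { headers: { "X-PAYMENT": proof } });
  return retry.json();
}

// Hypothetical helper: settles the payment and returns an encoded proof.
async function payFromAgentWallet(requirements: unknown): Promise<string> {
  throw new Error("wallet integration elided in this sketch");
}
```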

This capability unlocks new use cases around prediction markets, weather derivatives, and parametric insurance—all operating on-chain with minimal human intervention.

As synthetic content proliferates, the value of verifiable real-world data increases. AI systems need trusted inputs, and the cryptographic proofs that DePIN networks provide become essential infrastructure.

The compounding effect

These developments create a compounding effect. Shared protocols reduce coordination costs by eliminating integration friction. Transparent incentive alignment drives up data quality across networks. Participants capture direct value from resources they already own.

The models discussed show machine data evolving from siloed exhaust to a trusted, priced asset that enterprises consume for real business applications.


The Decentralized Storage Alliance opens its Group Calls for technical discussions on AI, DePIN, storage, and decentralization. If you're building in this space, we'd welcome your participation in our ongoing research and community initiatives.

Keep in Touch

Follow DSA on X: @_DSAlliance
Follow DSA on LinkedIn: Decentralized Storage Alliance
Become a DSA member: dsalliance.io
