
While testifying in court, Elon Musk revealed that xAI used OpenAI's models to help train its own, going so far as to call the idea of "using other AIs to validate your AI" normal practice. His comment suggests that the AI industry routinely tolerates some mixing of different AI models, and it raises pointed questions for blockchain ecosystems that increasingly depend on AI infrastructure.
Cross-Model Training as Industry Norm
By describing validation with outside models as normal, Musk reflects how common such practices are in machine learning workflows. Developers frequently use off-the-shelf large language models to generate synthetic data, benchmark outputs, or perform light fine-tuning. Decentralized AI networks and Web3 protocols rely on the same methods, both to get off the ground and to keep compute costs down.

Intellectual Property and Data Provenance Issues
The statement highlights that intellectual property, licensing, and training-data lineage remain unresolved issues. Blockchain proponents argue that immutable ledgers and cryptographic proofs can address provenance by timestamping datasets and model weights.
Projects focused on decentralized storage and zero-knowledge proofs are working on ways to verify whether a model was trained on proprietary or open-source data. However, legal standards for AI-to-AI training remain unclear, raising compliance concerns for DAOs and tokenized AI platforms operating globally.
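To make the timestamping idea concrete, here is a minimal sketch of how a provenance record might be built before being anchored on a ledger. The function names and record fields are hypothetical illustrations, not any specific project's API; a real system would replace the final print with an on-chain transaction.

```python
import hashlib
import json

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of the raw bytes."""
    return hashlib.sha256(data).hexdigest()

def provenance_record(dataset: bytes, weights: bytes, timestamp: int) -> dict:
    """Bundle digests of a dataset and model weights with a timestamp.

    In a real system, this record (or its hash) would be committed
    on-chain so anyone can later verify that these exact artifacts
    existed at this time.
    """
    return {
        "dataset_sha256": fingerprint(dataset),
        "weights_sha256": fingerprint(weights),
        "timestamp": timestamp,
    }

record = provenance_record(b"training-data-v1", b"model-weights-v1", 1700000000)
print(json.dumps(record, indent=2))
```

Because the digests change if even one byte of the dataset or weights changes, a timestamped record like this lets a third party check claims about what a model was trained on, without revealing the data itself.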
What Decentralized AI Governance Means
Musk's comments landed as crypto communities debate governance models for AI deployment. DAOs backing open-source models must balance the speed of innovation against attribution and accountability for how those models are built.
Smart-contract-based licensing, on-chain royalties, and token-incentivized data contribution are among the proposed solutions. Still, these standards are difficult to enforce when many foundational models were trained with the help of other commercial systems. The episode underscores that transparent AI governance is becoming a central focus for blockchain infrastructure providers and decentralized compute networks.
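As a rough illustration of the on-chain royalty idea, the sketch below splits a payment pro rata to data-contribution token balances. The function and its inputs are hypothetical; it uses integer arithmetic the way a smart contract would (no floats), with rounding dust assigned to the largest contributor.

```python
def royalty_split(payment_wei: int, contributions: dict) -> dict:
    """Split a payment pro rata to token balances, contract-style.

    Uses only integer arithmetic, mirroring how on-chain code avoids
    floating point. Any rounding remainder ("dust") is credited to the
    largest contributor so the shares always sum to the full payment.
    """
    total = sum(contributions.values())
    shares = {
        addr: payment_wei * balance // total
        for addr, balance in contributions.items()
    }
    dust = payment_wei - sum(shares.values())
    largest = max(contributions, key=contributions.get)
    shares[largest] += dust
    return shares

# Example: alice holds 2 contribution tokens, bob holds 1.
print(royalty_split(1000, {"alice": 2, "bob": 1}))  # → {'alice': 667, 'bob': 333}
```

The hard part Musk's remarks expose is not the arithmetic but the inputs: if a model was partly trained on another vendor's outputs, no contribution table on-chain captures that upstream claim.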