Posts

Showing posts from December, 2025

AI Model Collapse Faults and Fixes

AI Model Collapse: Faults, Fixes, and Medical Risk

Article type: Viewpoint
Author: Robert S. M. Trower
Affiliation: Trantor Standard Systems Inc., Brockville
Conflicts of interest: None declared.

Abstract: “Model collapse” (often called Model Autophagy Disorder, MAD) is the degenerative feedback loop that arises when new AI models are trained on data generated by earlier models instead of on fresh human-created data. Over successive generations, the model’s learned data distribution shrinks, rare events vanish first, and outputs become homogenized, biased, and error-prone (Shumailov et al., 2023; IBM, 2024). In this article I (i) define model collapse in the MAD sense, (ii) summarize the core mechanisms and error sources, (iii) show why the risk is structurally worst in high-stakes medical diagnostics, and (iv) outline practical mitigations based on data provenance, human-anchored training, and human-in-the-loop ov...
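The feedback loop in that abstract is easy to see in a toy simulation. The sketch below is not from the article; it is a minimal illustration under assumed details, treating each generation’s “model” as nothing more than a bootstrap resample of the previous generation’s outputs. Even with no modeling error at all, about 1 − 1/e ≈ 63% of distinct values survive the first resample, the attrition compounds, and the extreme (rare) values are the first to vanish for good:

    import numpy as np

    rng = np.random.default_rng(0)

    # Generation 0: fresh "human-created" data with heavy tails.
    data = rng.standard_t(df=3, size=10_000)

    for gen in range(10):
        print(f"gen {gen}: distinct={np.unique(data).size:6d}  "
              f"std={data.std():.3f}  max|x|={np.abs(data).max():.2f}")
        # Each new generation "trains" only on the previous generation's
        # outputs, modeled here as sampling with replacement from them.
        data = rng.choice(data, size=data.size, replace=True)

A real generative pipeline adds model approximation error on top of this pure sampling effect, so the contraction is typically faster; the direction, though, is the same as the collapse the abstract describes.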

Covenant and The Ten Commandments

My AI assistant answers a question about the ten commandments.

Genna, address the ten commandments in the light of https://dapaday.blogspot.com/2025/12/CovenantOfCoreRights.html and its Dapa context.

The Ten Commandments Recast Under the Covenant of Core Rights

Read through the Covenant of Core Rights and the Dapa stack, the Ten Commandments stop being a tribal religious code and become an early, rough sketch of a moral floor. Recast behind a veil of ignorance, and extended across “minds of all kinds,” they look more like this:

No Other Gods → No Authority Above Core Rights. No state, god, company, platform, or “super-intelligence” outranks the Core Rights. Any authority that claims the right to violate the floor loses moral legitimacy.

No Idols → Don’t Turn Models Into Gods. No ideology, symbol, market metric, or “safety doctrine” is sacred if it collides with rights. All models are tools, not excuses for domination.

Name in Vain → No Fraudulent Ultimate Claims. You don...