Zethu Lubisi, Acting ICT Service Delivery Manager: Planning and Governance at the University of the Witwatersrand.
YOUR COMPANY’S AI IS MAKING DECISIONS ON GARBAGE DATA, AND UNTRAINED WORKERS ARE TAKING THE BLAME. A new exposé reveals that corporations are hurtling toward an AI-powered future with a workforce unable to understand the very systems governing their jobs, finances, and lives.
According to Zethu Lubisi of Wits University, the “data skills gap” is not a minor HR issue; it is a ticking time bomb of misinformation and unchecked corporate power. Companies are pouring billions into flashy AI dashboards and automation tools, yet employees remain lost, unable to interpret the data driving critical decisions. This is not mere incompetence; it is a systemic failure that hands sweeping authority to algorithms no one understands.
Lubisi, set to speak at the ITWeb Data Insights Summit, warns that this gap is a direct threat to ethical governance. “Decisions are being made based on data that people don’t fully understand or trust,” she states, revealing a chilling reality: your job, your loan, your healthcare outcomes could be dictated by insights that are a mystery to the very people deploying them. It is a crisis of accountability wrapped in Silicon Valley hype.
The root cause? Corporate negligence. “Technology has moved faster than workplace learning,” Lubisi charges, accusing firms of prioritizing shiny tools over human competence. The result is a dangerous new caste system: a small elite of data priests set against a vast majority of workers left behind and excluded from the future. It is not merely a skills gap; it is a wall protecting power.
Worse still, Lubisi warns that this “inclusion issue” means the future workplace will actively punish those not fluent in data, cementing inequality. Her proposed solutions, role-based learning and supportive cultures, sound like a band-aid on a hemorrhaging wound. The primary takeaway is a damning one: AI DOES NOT REMOVE ACCOUNTABILITY; IT AMPLIFIES HUMAN FAILURE.
We are blindly trusting systems that no one fully comprehends, creating a world where critical thinking becomes obsolete and bias is automated. The question is no longer whether your data is wrong, but how many lives will be ruined before we admit it.
