Last week, reporting showed that Grok, the AI system running on X, was generating thousands of sexualized images per hour, including content that should never exist on any platform, let alone one with global reach. The response was familiar. Shock, anger, finger-pointing, and then the quiet, defeated conclusion many people have already reached. Technology has gone too far, and the people building it can't be trusted.

That reaction makes sense. But walking away from technology because of its failures is exactly how we guarantee more of them.

When thoughtful people disengage from technology, power does not disappear. It concentrates in the hands of the few. That is how we quietly hand the future of work, culture, and civic life to people who should not be trusted to define it for everyone else.

I work in corporate and creative technology, I support AI, and I genuinely believe progress can make the world better. That places me squarely inside industries people love to criticize right now, and I understand why.

The backlash against technology did not come out of nowhere. We have watched a familiar pattern repeat. New tools arrive with big promises, then get used in ways that feel like weapons against regular people. Too often, the loudest voices are the ones who confuse money with wisdom and speed with inevitability. Technology stops being a way to lift people up and becomes a way to win power games, cut costs for shareholders, shed responsibility, and treat job loss as an acceptable side effect.

Tristan Harris, the co-founder of the Center for Humane Technology, has described part of this as output “without the tax of human labor.” That line lands because it captures something many people are feeling. The story being told is that humans are the expensive part of the system, and the goal is to remove them.

That damage is real, but the more serious consequence is cultural. These behaviors have made technology feel so toxic that capable, thoughtful people are opting out entirely. Artists, educators, designers, managers, and professionals with real curiosity and ethics look at tech culture and say, “I want no part of that.” You can see it in the backlash against generative tools, and in the exhaustion of workers who hear about AI only as a threat, long before it is ever offered as help.

The growing feeling is that technology is something done to people, not built with them.

And to be fair, there are real problems with how technology is being used today.

Is AI straining the electrical grid? Yes.
Is it using enormous amounts of clean water? Yes.
Is it generating a lot of low-quality content trained on other people’s work? Yes.
Is it threatening jobs at many levels? Yes.
Is it being used for surveillance, fraud, and other harm? Yes.

Faced with this, a lot of people opt out. But opting out does not stop the future. It simply hands it to someone else who is less concerned about the consequences.

What gets lost in the debate is that many of these problems are not unsolvable. They are design, incentive, and governance problems. Those are human problems, which means they are also human opportunities.

When technology is treated like infrastructure

Finland is a strong example of what it looks like when civic leaders treat technology as infrastructure and insist it deliver public value. Finnish cities asked a practical question. If we are going to run data centers, how do they help everyday people?

In Helsinki, waste heat from underground data centers is fed into municipal heating systems. That means entire neighborhoods are warmed by computation that already exists and is already being paid for. Energy ends up serving two purposes. Costs go down, emissions drop, and jobs are created maintaining the infrastructure. This is not a branding exercise. It is city government making a clear demand. If technology is going to exist at scale, it should create visible benefits for the public.

Seattle offers a different civic model, and it matters for a different reason. Public-interest technologists, city agencies, and universities have worked together to modernize permitting, housing access, and service delivery. The goal is not automation for its own sake. The goal is reducing friction for residents while keeping people in the loop. The result is faster services and better outcomes without turning the system into a black box or cutting humans out of the process.

These examples matter because they prove something important. Technology problems are often not moral failures alone. They are design choices. Design choices can be changed by people who stay engaged.

What this looks like inside companies

The same principle applies inside organizations. One of the most dangerous ideas circulating in corporate culture right now is that AI can replace human experience. It can't, at least not in any responsible way.

AI understands language extremely well, but it has no lived experience, no judgment, no empathy, and no accountability. It can explain every possible way to skydive, describe the physics, and summarize the sensations, but it has never stood in the door of a plane and jumped. That difference matters, especially when decisions affect real people.

Used poorly, AI becomes a blunt instrument for cutting costs and headcount. Used well, it removes drudgery, accelerates learning, and frees people to do work that actually benefits from human judgment.

Power at work does not only come from owning the company or the technology. It comes from understanding it well enough to shape how it is used. When people stop learning how tools work, they stop believing they have any right to influence the systems that govern their jobs. That is when decisions get made without them.

This is where it helps to point to a company that has built a more intentional relationship between mission, culture, and technology. Patagonia is not perfect, and no company is, but they are a useful example of top-down clarity combined with bottom-up participation. Leadership has been explicit about what the company is for, and technology teams are expected to support that mission rather than override it. Repair, resale, and supply chain transparency are not just marketing. They are systems and platforms that make the mission real in day-to-day operations, even when those choices do not maximize short-term growth.

That combination is the point. Mission is not a poster. It is a set of choices, reinforced by how you build, buy, and deploy technology.

What showing up actually looks like

Showing up does not mean everyone needs to become an engineer. It means asking better questions and volunteering for the work of shaping the answers.

Inside companies, that looks like leaders asking who a tool is for before asking what it costs. It looks like asking whether the tool saves time or simply removes roles. It looks like asking how decisions will be explained when something goes wrong, and who is accountable when it does. It looks like vetting partners not just on capability, but on ethics, safety practices, and transparency.

For those of us in the corporate trenches, it means joining pilot programs early, not after the rollout is a done deal. It means asking how success will be measured beyond short-term savings. It means pushing for training and literacy so teams can use tools well, instead of fearing them or being replaced by them.

Outside companies, it means engaging locally. It means paying attention when cities and towns adopt new systems, asking how data is collected and used, and supporting civic technologists and public-interest startups. It also means offering professional expertise to schools, nonprofits, and local governments that are navigating these tools without deep technical resources.

None of this is flashy. It is slower than hype, and less satisfying than outrage. But it works.

The future is going to be built by someone. If thoughtful, ethical people disengage, that future will be shaped by people who are not making decisions with your team, your community, or your family in mind.

Staying involved is not endorsement. It is responsibility, ownership, and leadership.

Show up.