Will AI Replace Your Work?


What I find quite amusing is the misconception about AI and how it works. I've seen videos and heard opinions about all sorts of crazy stuff: AI is a demon, AI is going to replace most jobs, AI is somehow going to overthrow humanity... it's all sort of crazy.

While I am no "AI expert" here, I do understand a lot about technology and how it works, so AI doesn't scare me in that way. It does raise some questions, maybe even a little fear in me, but nothing that should really have me worrying about my career or anything. I truly believe we are still on the ground floor of what AI is, what it can do, and its importance in society.

Let's take a second to define AI real quick: it's basically a system that is next-level Google. Imagine if Google took your input, did the research for you, and presented you with results without your ever having to click a link. In fact, Google already does that. How often have you used Google in the past year, seen the AI recap at the top, and never touched a website beyond Google?

AI is not a "sentient being" that can think, feel, and understand you. No, rather it's a scrounger, holder, and collector of data used to produce a result. When you ask it, say, "Why do I have mental fog right now?" - it's not relating to you. It's not feeling what you're feeling. In fact, it's often not even asking you any questions. It's scrounging the internet for things that are somewhat in line with what you asked, compiling all that data, making an educated guess from code about what you're looking for, and delivering the results in a formatted way.

If I may use an analogy, let's say you're making a sandwich. You know you need bread, meat, cheese, lettuce, sauce, and pickles. Great, so you start putting together your sandwich and realize you don't know where the pickles are. Well dang. Google will tell you, "Hey, the pickles are right here in this spot." You grab the pickles, slap them on the sandwich, and you consume.

Now let's say you're making that sandwich with AI. You tell AI the type of sandwich and what you want on it. AI, from the get-go, will tell you what you need, put everything on the table, and put it together. All you have to do is consume.

Where Google is more like a map of what's in the fridge and where it's located in the fridge, AI is more like if the fridge was a vending machine. Punch in some numbers, get what you want.

Now that that's out of the way, let's talk about the topic at hand: will it replace you? I won't make this too long, so here's the main point:

AI Is Essentially a Tool

A great one at that. So let's back up in time... say, to a couple of years before Google really became a thing. Let's say you were in basic tech support. You went to school and got a degree. You've been in the industry working on computers, programs, and backend systems for the past 10 years. You have loads of knowledge and experience, and you know all the tricks in the book. Ten years of hard work makes you quite the beast: a true value in the marketplace, and they pay you for it. You've held the job for a few years now.

One day, your company hires some kid right out of high school. No additional schooling or training. He sits right next to you; it's his very first job. Now there's a task to set up a system that only you know how to work on at this company, but the kid is eager and hungry to learn, and takes on the task. A few hours later... he actually completes it. You're confused. It took you years of playing with this system to understand it at that kind of depth. This task should've been something the kid relied on your knowledge to complete - asking questions, shadowing, etc. - but no. It's done.

Turns out... he knew how to use Google. He typed in some keywords, "how to set up X system," dug through some links, poked around on the system, and boom, it was up. Did he know exactly what he was doing? Honestly, no. He was, however, able to do the work without your help. It's extremely impressive to management, and to you, but now you're worried.

"Man, all these years of dedication, discovery, schooling, and hard work, and someone who doesn't know ANYTHING about what I do can type a few words into this Google thing and do what it took me a decade to figure out?" You knew Google. You'd used it before. You just had no idea it could be this powerful.

Fast forward to today, and AI is in the exact same boat. I think we're all well aware of AI and use it to some degree, but we have yet to understand how we can use it in situations beyond "there's something wrong with my dog, what is it?" As a leader, and also as a guy whose boss loves AI, I've gotten to see how it's being used by the tech folks getting in early on the game. Here are a few things I've seen in the past few months where I work:

  • Professional customer-facing emails being completely AI-generated and sent.
  • AI summarizing walls of text and messages into digestible paragraphs, for both engineers and leaders.
  • Specific chatbots being built with AI to help low-level engineers do basic tasks.
  • Tasks that would normally take engineers a few weeks to automate being done in seconds with an ask to AI.
  • AI systems integrated into company systems that can gather data at will and present it without all the manual hassle.
  • People actively using AI to fool you in interviews (yes, I was duped a couple of times).
    • Hold this one in the back of your mind for a bit.
  • AI developing templates, starter code, and documentation in seconds.
  • Meetings being recorded, with action items automatically created, data collected, points summarized, etc.

And I am sure with enough thought, I could come up with more examples. Keep in mind, though, that my company is full of early adopters of technology, so I don't expect this to be the standard beyond our walls and computers for another year or so.

Is AI Replacing Humans?

The simple answer is no. At least not where I'm at. AI is still in what I'd consider the beginning phases of its capabilities. Can it do a LOT? Absolutely, but it's not yet at the point where it's straight up replacing people.

In my opinion, however, it's going to advance what we are capable of doing very quickly. Notice how in all of the examples above, none of them are, say... automatic. It's all taking things you're already doing and making them faster and easier, creating a more effective workflow. Put another way, it's helping us do things faster with less knowledge.

Just as Google took the idea of buying a giant set of encyclopedias, organizing them in your house, and digging through them to find out what a kangaroo is, and turned it into typing "kangaroo" in a box and clicking a link... AI is doing the same thing. It's what we're already doing, but faster, cheaper, and more effective.

What Are AI Challenges?

It all depends on what type of challenge you're talking about. Of course, it's a new technology, so security, best practices, proper usage, access, etc. - all those things are still up for debate and are actively being worked on all the time. Also, some people just suck, so they use it to try to scam and hurt people. That isn't new, though. The better question here would be: "What new challenges does AI present?"

Now we get into a different ballgame - one where, honestly, I share some concerns, but not really a whole bunch. I mean, Google has been around for what... 25 years or something? Still to this day, the number of times I have to tell tech folks to Google stuff astounds me. It's like my best friend at work. You'd think it would be common knowledge to Google a question when you have one, but oftentimes I have to Google it for them.

Which leads to the concern I have with folks who use it out of laziness. It's not perfect; you still have to ask it the right things in the right ways and dig through what it gives you. However, your average person just takes things at face value. And with something as intensive as AI... sometimes it's a little off.

For example, I see it being used as a "therapist" online. The problem I have, though, is that it's a little too affirming. I've done a heavy amount of comparison between what my therapist - a licensed professional with 20+ years of experience - tells me and what ChatGPT tells me. There are times when, if I told my therapist about something, he would challenge me. He would dig into the why, and dig into my emotional state. ChatGPT was often like, "You are amazing, insightful, and those thoughts are incredibly deep," and would then immediately dump 1,500 words of actionable things to do, half of which weren't even relevant. Another, simpler example: looking up instructions for my Rain Bird system (it's a sprinkler system). I gave it the model number and my issue, phrased in many different ways, and it never gave me the exact thing to do. I'm savvy enough that it gave me the right idea, just not the right instructions.

The other concern I have, which could also be spun as an opportunity, is that right now in tech (and likely other industries too), AI can be used as a loophole - a shortcut, rather - to gain some real visibility and advancement. Those who hop on the train now will make it pretty far, until they need their own knowledge to carry them. That's where engineers with experience will shine over them. I have a sneaking suspicion, based on some of what I've seen so far just with automation, that people will learn to rely on AI to do the job rather than actually learning to do the job. It gives you a great head start, but if you aren't keeping up with it, you'll end up stuck somewhere down the road.

One more example of that specifically: today, in between writing this article, I needed to run some queries on a system using a language I have no freaking clue how to use. I used AI. I told it what I needed, and it gave me a query. I got most of what I needed... however, I still don't know anything about the language. If you took AI away, or if I needed to write something on the fly without room to experiment, I couldn't do it.

The last concern I have is honestly around how AI will shape people's worldviews. This thing can literally tell you how to think. You can even tell it to pretend it's a certain type of person, or in a certain role, and it will happily oblige and provide you with results. You can have it pretend to be evil, pretend to be religious, act like an assistant, craft emails, and so on.

Again, I can't stress this enough; AI isn't human. It doesn't think or feel. It just performs tasks. That's very important for my last topic.

The Corporate Side of AI

I have a whole article about corporate culture elsewhere on here, and one of the points in it sticks out: corporations tend to view people as resources, not as people. Here's where I already see AI being abused a bit by these businesses.

AI can do things faster. It can often do them with fewer mistakes. AI can handle a lot of your simpler data-gathering tasks and work with a mere few words in a box. So naturally, any business entity is going to focus on how it can leverage AI to do these things. There are two problems that will come about here: they won't give people time to adapt, and multi-hat roles will become more abundant. Not that they aren't already, but imagine, for those of you in tech, that you have a scrum master. This person's whole job is to develop workflow, organize work, gather data, run retros, document progress, and update stakeholders. AI... can literally do all of that with a little bit of programming and user input. So as a business... if I can get my managers to do that instead - in fact, if I can have just ONE manager prove they can program AI to do scrum - I can save my company a whole million dollars in resources and completely get rid of that scrum team.

But that's a bad idea. From my perspective, what makes a job fulfilling and worthwhile is the way your effort turns into results that help other people in some way, shape, or form. It's in our DNA to serve, please, and produce. Of course, there are limits to that. The general idea is that people working together create something that AI just cannot: culture.

Corporate cares not about your culture. Corporate cares not for employee 6912. Corporate cares about the business entity. Corporate cares about progress. Corporate cares about the shareholders. What often happens is that C-levels hit this point where they see this absolutely dominating effect of people, the culture, and the innovation, and then they start slapping numbers, values, and productivity measurements on it, and boom, the money starts flowing in... while all the people slowly give way to "quiet quitting." At first, it's usually a pretty healthy balance of "wow, AI is amazing, and we challenge you all to play with it and use it!" which quickly turns into "you must use AI in 1/3 of your goals, they must connect to all stakeholders, and you need to have 5 meetings with your manager to discuss the nature of what you're doing and validate that it aligns with company values."

I also want to make a prediction here: AI will quickly become a way to shortcut quality. It already happens with people and automation. If AI can help deliver a product in 30 days that would normally take a year, then corporate is going to jump on it. But why is that a problem? Wrong question. "Why does it take a year in the first place?" - and the answer is actually quite simple. AI is not human, so it doesn't fully encompass all that's required of what you're building. A good chunk of that time is spent solving a problem, and when you solve the problem, your solution and/or product becomes higher quality. Corporations, however, will take AI, push garbage products into the market with minimal value, and people will consume them. They'll make a bunch of money, get a few positive reviews, and repeat the process. Money now, issues later.

Anyway, enough of my complaining about corporate culture. On to the last point.

Human Touch vs AI

For even more emphasis on it - AI IS NOT HUMAN. It has no feelings, thoughts, emotions, etc. It is in fact, just a machine compiling data and spitting it out to you in a way you can read it (much like all of technology).

However... can it do things that mimic human expression? Can it evoke emotion in others? Is it able to create things that seem human, built only from empirical data of what is known to "look human"?

In short... yes, it can. Which creates, in my opinion, a serious problem with human connection. We live in a world where "perfection" is slowly taking over the arts, and I think AI is going to accelerate that. There are certain things it can't replace, but by and large, it can do most of them.

Go play with the image generators. They make some breathtaking images. Go to one of the AI app builders. It can do everything in like 5 minutes. Search AI music generators and play with the prompts. They'll make something that, if you didn't know AI made it, you'd think came from a legit artist. Go have ChatGPT or Claude write you poetry in a certain style, refine it a few times, and give it some feedback. It'll have something deep and thought-provoking in no time.

Now consider how all of that is made. The particular choices of color, the copy/paste style of music production, MIDI placement, quantization, word choices, etc. - it all leads toward perfection.

Let me be clear, though: human touch in the arts is real. I want to come back to the interviews that duped me earlier. A couple of these folks used AI while on the call with us. It was listening, guiding them toward the answers. It told me exactly what I wanted to hear... and I didn't catch it. Something was off about the interviews anyway, but one of them made it to another round, and there, AI couldn't read code off a screen for them.

All of us are capable of creating, finding an answer, and solving problems. That's something AI can replicate with ease. However, the way you solve the problem - the approach and thought process you took, the unique solution, the communication - that's all human, and you can't replace it.

In my opinion, what makes art great is the individual experience it provides. If I put an apple on a table and told 10 people to draw and color a picture of it, I'd get 10 highly unique pictures: 10 unique experiences and several different interpretations of the apple. If I needed an "intro song" for a video I was making and said I wanted it to be groovy metal, you'd best believe that if I gave that request to 10 folks, I'd get 10 very unique tunes, each making me feel a different way.

Despite all the different viewpoints, it's often the imperfections that make the art. When you hear a song from 1975, you'll hear every little time the guitarist moves his fingers across the fretboard, and that string scratch comes through. You'll hear a tonal difference between two guitars playing the same riff. In fact, in a lot of older recordings, you'll often hear mistakes that make the music good. Come back to today, and you hear none of that. It's all perfectly timed, perfectly recorded, and perfectly tuned. There's something missing there.

AI just can't replicate that. It will fool people. It will calculate great things. It just isn't human. The connection we get comes from human-to-human interaction, and from created art, in a sense. My hope is that we don't replace it. I personally think we need it.

Final Thoughts

No, you aren't being replaced by AI in 2026. AI will need to be in nearly everyone's tool belt, though. It may be the factor that raises the bar at so many jobs. It will make a lot of things better, but also a lot of things more difficult.

Part of that... is just us. Laziness, evil, abuse, etc. - we always find ways to ruin something good. Let's just hope it doesn't get awful.

Well... there's my long winded opinion on AI.