
Time Capsule Essay

The Future?

On the 23rd of August, 2023, I predicted what a day in my life in 2030 would look like. I wanted to write a time capsule, capturing our expectations of the future in a verifiably dated form. It looks like this.

I hope you enjoy.

- Leo 😉

Edinburgh, 26/08/2030:

Ding! I looked up and saw the “seatbelts” sign, now brightly illuminated. The bell’s ding barely pierced the slow melodic drum of the airplane’s engines. My eyes were irresistibly pulled to the view beyond the 737's quaint windows, Edinburgh’s landscape unfolding rapidly before me. As it morphed from greenery to roads to runway, I couldn’t help but reminisce about my student days here, taking this exact flight every three months. It occurred to me that my first visit to Edinburgh was over a decade ago. The familiar landscape masked a new world, one technologically redefined.

I was wrenched out of my daze by the twenty-something next to me. Her fists clenched the armrests on either side of her, betraying her unease with flying. With each turbulent thrust of wind, her stomach dropped lower and lower. But most intriguing to me was the ski-goggles-esque headset that adorned her face. I recognised it instantly as Apple’s latest Visions, an XR headset that let her seamlessly blend the real and virtual worlds. Her pair was even customised to match her outfit.

She was clearly a fashionable Gen-Z, and her headset showed she was “engaged”. An animation of blue and purple waves of light, akin to the Aurora Borealis, danced atop her goggles. The “engaged” animation implied she was in full Virtual Reality, completely shut off from the real world. I wondered where she was virtually at that exact moment.

Taking inspiration from the animation, I imagined she was in a peaceful arctic retreat, gazing at the beautiful spectral show of the Northern Lights. It was certainly a lot more peaceful than being in a flying tube of metal hurtling towards the ground at a concerning velocity.

But no virtual trickery could disguise the lurching of the plane as its tyres landed - pummelled - onto the runway. No matter how technologically advanced our society became, I knew a turbulent Ryanair landing was something I could rely on.

My attention was drawn back to the girl. She’d switched back to “Present” mode, as I now saw her eyes through the goggles. With the ability to now interact with the real world, she reached for her bags, pausing momentarily, eyes dancing around the screen. She was probably catching up on all of the notifications she’d received while flying. My theory was all but confirmed when I saw her thumbs methodically tap specific fingers - the modern way of “clicking” for these goggles.

While passengers began to disembark, my mind wandered back to the girl beside me, and I speculated just how profound her aversion to flying might be. XR headsets were notoriously unreliable on planes and in cars, but if she was willing to use hers regardless, her situation must have been quite dire. Wow. Flying must have been extremely difficult for her before this technology.


As I exited the airport, I popped on my own pair of Visions and called an Uber. “Driven or Autonomous?” a prompt asked me. It only then dawned on me that the Lothian Council had granted permission for self-driving vehicles to roam specific streets the month prior, a full year after my first autonomous experience in London. Wanting to play my own music, I selected Autonomous.

As I waited on the bench for my ride into town, I switched my Visions to “Engaged” mode and replayed the Lakers-Bulls basketball game from just the night before. As the experience loaded, I was suddenly and completely transported to the United Center, any trace of Edinburgh erased. I sat courtside, celebrities beside me, all cheering at each shot, rebound and block. Animated fireworks exploded above me after each three-pointer. A floating scoreboard danced in my peripheral vision, coming into focus with a single glance. The sounds of squeaking sneakers and cheering supporters around me only immersed me further in this world.

As technologically impressive as it was, I still knew the result of the game. The Bulls were on their league-winning streak, and this game was no different. It was a nice change of pace from only a couple of years back, when they struggled to gain any notoriety at all. Because of the game’s unsurprising nature, I pinned a single 2D screen playing Sunday’s F1 race to the corner of my eye. Ironically, it gave me joy to see Red Bull, a once outrageously dominant team, struggle to break P5. Just as McLaren crossed the finish line, the real world faded back into view: my Visions were alerting me that my Uber had arrived, with a convenient green waypoint indicating which of the many Toyota Priuses was mine. As I entered, my Visions authenticated with the empty vehicle, confirming I was the correct passenger.

Almost instantly, my Visions wirelessly connected as a guest to the car’s modern Apple CarPlay system. The Miseducation of Lauryn Hill began playing, but I quickly dug through my Spotify and resumed BBC Radio 4(U) on my ride into town.

Presenters Emil and Vicky bantered about how they’d spent their weekend. Emil began a long-winded rant about how he’d tried for hours to get some money from a class-action lawsuit, only to end up with £2. Vicky, with her soft Cardiff accent, cackled at this. A bit more light banter ensued before they mentioned they were going to continue the music with a throwback to the mid-2010s. As the sound of drums began to play, I couldn’t help but crack a smile. My favourite song at the time, Love Is Only a Feeling by Joey Bada$$, began playing. Of course, this was no mistake.

BBC Radio 4(U) was a custom radio station I had created on Spotify, shaped to my tastes and whims. The presenters, Emil and Vicky, were two AI radio hosts I modelled on their real-life BBC Radio 1 counterparts. Fortunately for me, the real-life Emil and Vicky had opted to make their voices and mannerisms free for public use. This meant I could listen to a radio show with charismatic hosts who played the exact music I enjoyed. These AI hosts spoke the perfect amount (according to my taste, anyway), cracked jokes and always spoke about things relevant to me, whether I realised it or not. It was like magic, save for the occasional hiccup in the AI-infused humour, most notable when a tone of voice didn’t line up with the punchline. A small price to pay for music I always enjoyed.

I rummaged through my bag for my iPhone 20. I used my Visions more often, but I occasionally enjoyed the more tactile experience of typing on glass. Also, I was about to catch up with Pi, and the experience was hit-or-miss when there were no microphones nearby.

Like a lot of people my age, I had an AI personal assistant. But while many renamed theirs, I kept the default name: Pi. According to its creator, an AI studio called Inflection, Pi stands for Personal Intelligence. I customised everything else about it so that it could serve me as well as possible, but I always opted against changing its name. Renaming it, I felt, would be the final barrier broken before I completely anthropomorphised one of the things I spoke to most often.

“Pi, are you there?” I called out.

“Hey Leo,” a South Dublin voice, distinctly feminine, responded gently. “How can I help you?”

“Catch me up, would you please?”

“Sure thing. Your colleague Andrej scheduled a meeting for today at 2 p.m. I know you’re on PTO, but he mentioned in the invite that it’s about trialling a new product. I know you enjoy those trials, so I haven’t responded to the virtual meeting invite yet. Would you like to go?”

“I’d love to. Book a nearby co-working space with a separate meeting room that I can take the meeting in. If it’s less than a twenty-minute walk away, don’t bother booking a taxi there.”

“Great. I’ll get on that,” she asserted. “Also, Google is settling a class-action lawsuit with the citizens of the EU under the EU AI Act over how it trained LLMs on user data. I’ve scanned the details of the case, and you’re eligible for some compensation. I can’t guarantee it’ll be much, but I can get it sorted. Do I have your consent to provide your banking details to exclusively the relevant parties so that you can get paid?”

“I, Leo Camacho, give you consent to share my banking details.” I asserted. I always found it amusing how professional you had to be with Pi when it came to data sharing. She wouldn’t accept a simple ‘yes’ or ‘no’. “Anything else?” I continued.

“Yes. Last thing. I saw on your Find My that your sister is up from Manchester and is in Newcastle. With the new LNER train in operation, her journey from Newcastle to Edinburgh would only take thirty minutes. Do you want to arrange to do something with her? Keep in mind you’re meeting Tom and Gregor at 8 pm.”

“That’s a great idea, Pi. Change your settings to encourage me to meet my friends and family more often by making more offers like this one. Now send the following voice-message transcript to Manny: ‘Hey sis. Saw you were back up in Newcastle for the week. Want to come up to Edinburgh for some dinner? The new LNER train is finally up and it’d legit only take you thirty mins to get here. If you’re down let me know and I’ll get Pi to book your tickets and reserve a table.’”

“No worries, Leo. Also, while we were talking I sent a reflection of myself to talk to Google’s Legal AI. You were entitled to £50. I took a 20% commission and deposited the rest into your account. I also spoke with the AI of The Melting Pot, a local co-working space, and you’re booked in from 2 to 2:30. The transcripts of both conversations have been sent to your inbox.”

“That’s perfect. Thanks, Pi.” I quickly scanned the conversations Pi had had with the other AIs but found nothing of interest. I chuckled at how Pi had argued to Google’s Legal AI that I was entitled to £75, to no avail.

This was one of my favourite features of modern-day chatbot assistants: their ability to talk to other AI assistants, and their accountability in doing so. My Pi was able to send a copy of itself, with my interests in mind, to talk to company representatives (AI or not). It could report back to me with its findings, its conversation cryptographically recorded and signed. This was the single feature that made me feel most like I was living in the future. Of course, it wasn’t perfect. Not all businesses had an AI representative Pi could converse with, and some that did limited its abilities to simply providing information. In those cases, no matter how capable your Pi was, you’d have to go onto the web manually to do what you were looking to do. But because Pi, and other similar AIs, prioritised businesses they could talk to, there was tremendous economic incentive for local businesses to develop an AI. And since that was an expensive endeavour, OpenAI, Inflection and many other AI studios sold just that feature…

Regardless, I appreciated the £40 for doing nothing. It once again became clear to me that Pi was the single best investment I could make. Pi cost me about £50 a month (the price varied with usage), but between the money she earned me and the time she saved me, she had easily paid for herself many times over. Once again, it surprised me that Pi, and AIs like her, were still more unpopular than popular despite their immense usefulness.


A prerecorded voice rang out through the Uber letting me know I had arrived at my location. As I exited the car onto Princes Street, my stomach dropped when I realised what I’d walked into. Edinburgh Fringe 2030, the largest arts festival in the world, had returned for its annual run, the thick swathes of tourists along with it. I had moved to London in July 2026, so this was my first Fringe in over five years. I made a mental note to ask Manny if she’d like to watch a show, just like we did at my first-ever Fringe. Grimacing, I began my journey through the swarms of crowds to my accommodation for the week: my mate Gregor’s flat.

A brisk 15-minute walk later, I arrived at Gregor’s flat. I let myself in using an old-school combination lockbox attached to the wall. Climbing the stairs, I saw a large white-and-red box, riddled with cables, hanging on the wall. Curious, I inspected it closer. Of course. The iconic Tesla logo told me everything I needed to know. It was the building’s communal battery. Gregor’s building was part of the new Net-0-Carbon initiative, which meant it not only utilised hyper-efficient solar panels and bladeless wind turbines but could also sell excess electricity back to the grid. The technology had existed for years, but such green initiatives were becoming more and more common in cities thanks to aggressive legislation from the UK government.

“Neat” I murmured before climbing the rest of the way.


“Gregor?” I called out into the flat. No response. He didn’t share his location with me, but I presumed he was at work. Starving and looking for some time to kill, I decided to attempt a dish I’d never cooked before. I scanned the HelloFresh app for ideas and landed on a timballo. After a quick trip to the Sainsbury’s downstairs, I was ready. Bar the fact that I had no idea how to cook a timballo.

Naturally, I donned my headset and asked a HelloFresh bot for some tips on the process. As the eggs boiled, I hastily pulled up some timers and instructional videos, pinned to my peripheral vision. Only once the eggs were already on did I see how long this was actually going to take. One of the things the bot had failed to mention was that this was a three-hour affair…

Three arduous hours later, I had produced a timballo, thanks to a constant baking “buddy” beside me. I was appreciative of this new technology. I was never the greatest chef, but I was proud of the meal I’d made myself, a meal I would never have tackled without the help of a pro (human or robot) next to me.

I had sat down to enjoy my lunch when a news alert came on screen. I watched, uninterested, as a representative of the council announced that, as it was the Fringe’s busiest week, many roads would be closed, bus timetables altered and diversions implemented. The diverted routes only marginally affected my plans, but I scanned the alert regardless for a cryptographic signature indicating it was a legitimate post, something I had learned was crucial to do.

A year prior, I had seen a BBC News report with Rishi Sunak in which he declared the City of London was revoking Uber’s Autonomous licence. Sensing impending financial danger for the ridesharing company, I hurriedly sold half of my Uber stock at a (marginal) loss. I only realised the day after that the video was a deepfake, designed to send Uber’s stock price to the floor. The video was indistinguishable from a legitimate report, the PM’s voice and mannerisms a perfect replica. The only thing missing was the cryptographic signature indicating it was real. In my rush to sell, I had missed the single glaring red flag.

I finished my lunch, put the rest in the fridge (with a sticky note saying “Eat away!”) and made my way to the co-working space. With ample time to make it, I decided to take a detour through George Square, my old University campus.


Immediately, I was shocked by how little it had changed. Granted, temporary buildings had been erected for Fringe, but the University architecture still had its old charm to it. Post-grads still roamed the campus with a sense of impending deadlines (read: impending doom). Undergraduates wouldn’t return for another few weeks, but the campus didn’t need them to have the characteristic hustle and bustle to it. Despite countless technological advances since my time here, it was strikingly unchanged.

There were some subtle contrasts to when I was a student. As I looked at the students individually, I could see some were discussing their dissertations with their supervisors, while others were having their ideas and theories challenged by some AI educators. I’d have loved to have seen my old lecturers as bots. What I would’ve given to have had the tools these students have during my first year at University.

I made the regrettable mistake of wearing my Visions as I passed a Tesco on my walk. Almost instantly, personalised offers began popping up around me. Floating banners exclaimed that I could get a meal deal for £3.50! An echo from my past reverberated in my skull, back to when I practically lived off those meal deals. I’d just eaten and wasn’t hungry, but the ad was incredibly enticing.

The shove of a stranger took me out of my hypnotic trance. Frustrated at what had just happened, I turned off my headset and rested it around my neck. The bombardment of ads almost made me buy the last thing I needed right now. Irked, I moved on.

I arrived at the coworking space with 10 minutes to spare. A bright red door lay wide open, held there by a door stopper in the shape of a small dog. Within it, a pair of closed sliding glass doors. I looked around, confused, and saw a badge reader icon. I tapped my iPhone’s “identity card” to the reader, which accepted it with a satisfying ding. Like magic, the doors slid open.

Unsurprisingly, The Melting Pot was a typical co-working space. The walls were brightly painted, illuminated by a staggeringly large skylight. Surprisingly, the colours worked well together. The furniture was gratuitously oversized and there was greenery at every possible turn. It had an unmistakable buzz to it as employees of countless different companies all worked, tapping expeditiously on their keyboards. I scanned past the common desks for the meeting rooms. I was happy to see The Melting Pot had a 3D meeting room, which allowed me to meet my colleagues in London and San Francisco as avatars, not as simple 2D projections. I plugged my MacBook into the wall and donned my Visions. I swapped the interchangeable battery for a fully charged one and prayed it would last me the whole meeting. Apple Visions were many things. Long-lasting was not one.

“Hey Pi, catch me up for this meeting,” I asked. As Pi recounted details and information I might find useful for this tech demo, I silently whispered gratitude for my current position. I was an engineer working on Alignment at OpenAI London. The general public had conceded that Artificial General Intelligence, AI systems that are generally smarter than humans, was only a matter of years away. My job was helping ensure these models behaved in a way aligned with human intentions and values. I knew I was doing important work. Not to mention, I loved it.

As the virtual meeting space loaded up in my Visions, I saw a semi-realistic 3D rendering of Ilya Sutskever, the co-lead of Superalignment at OpenAI. Basically, the main guy in charge of ensuring these models don’t end up hating us. He began speaking and I wondered if he was speaking English or Russian. His English was near perfect, but I knew his Russian came more naturally. I, of course, couldn’t tell, as Microsoft Teams’ new auto-translate feature converted each speaker’s words into the language most suitable for the listener. Not to mention, it did so in our own voices. Curious, I went to ask Pi but realised she was disabled for the meeting. The meeting involved proprietary technology, so by policy no third-party bots were allowed present.

As the meeting concluded, I walked away with a grin on my face. The technology we trialled was the culmination of my first six months at OpenAI. We had developed a system that allowed GPT-7 to iteratively supervise more complex models while giving us confidence they remained aligned.

I left The Melting Pot and walked towards Bristo Square to see my sister. The walk, a solid 20 minutes, flew by as I reminisced about the good times I’d had in each passing location. I was knocked out of my nostalgia when I spotted a familiar face. My sister sat on a decorated bench in the square, a book in hand, admiring the tipsy tourists ordering another round of pints in the sun. I ran up to her and gave her a bear hug. It had been almost six months since I’d last seen my sister in the real world. We chatted virtually all the time, but it wasn’t the same. As we caught up, I mentioned we should go see a show.

“Yeh, absolutely, let’s do that,” she agreed. “But nothing with chatbots or where we need our Visions. They’re kinda crap.” Quite frankly, I agreed. Some Fringe shows heavily involved the use of technology. Some had interactive virtual features that could only be accessed with a pair of Visions or a similar headset. Other ‘One Man Shows’ actually had AI voices serving as co-stars. I’d watched recordings of both formats on YouTube and clips on TikTok and considered them too gimmicky to be worth it.

“Sounds like a plan sis. Ready for some food? I’m starving.”

“Let’s go” she replied. As we walked into our favourite Mexican joint, El Cartel, where lively chatter and sounds of laughter reverberated, I powered off both my Visions and iPhone. Sometimes, it was better to disconnect from all of the technology around us and just be present.

Leo, 23/08/23

Analysis:

The story you just read featured technology I (optimistically) expect to be available in 2030.

First, some background. These predictions, and in turn the story, were heavily influenced by two enormous technological advances that happened only in the last year: Apple’s Vision Pro and Generative AI. If you had asked me to write this essay a year ago, I would have imagined a wildly different world. I won’t find out for a while whether it would have been more or less accurate…

Apple Vision Pros:

One of the main features of the story was Apple’s Vision Pro. It combines a Virtual Reality (VR) and an Augmented Reality (AR) headset in one device, with a dial to switch from the augmented real world (imagine you’re in the real world with a YouTube video in your peripheral vision) to a fully virtual one (where the YouTube video now plays like a cinema screen and everything around you could be the Arctic).

I don’t imagine them being wildly different from the product Apple first announced only three months ago (a 3D AI-generated image of what I expect them to look like is available in the repo). The difference is that I expect them to be wildly more popular by 2030. By then, I expect this headset, and similar headsets, to be where the iPhone was in 2012: society was only halfway through the transition from feature phones (think BlackBerrys) to smartphones. In the same way, most people will still opt for their iPhone for most use cases in 2030.

But the early adopters (like myself in 2030) and the early majority will utilise the headset for what it’s capable of. I’ve long believed this technology was the future (thanks to Ready Player One), but seeing one of the tech giants throw its hat in the ring only further solidified my belief.

Thanks to the vibrant developer community that Apple has cultivated, there will be countless developers looking to port their apps onto the headset. It’s not clear to me yet whether the apps will be a glorified way of watching content or whether the devices will actually be able to interact with the world. Ever a tech optimist, I predicted they would do both, leaving it up to us to choose how we use them. Be it practical (the Uber app interacting with the world and highlighting which car was mine), pure entertainment (the basketball game) or a healthy mix (the instructional cooking videos and timers as I made the food). Ultimately, it will be up to Apple and the developers to decide what’s possible with these devices. I hope that by 2030 they’re an actual tool (like phones are) and not just glorified content-consumption devices.

Generative AI:

Generative AI has been around for a while. The suggested word feature above our keyboards on iPhones? Been around since 2014. But the mainstream only appreciated its value when, in November of 2022, ChatGPT rose to prominence. With a simple chat interface, anyone of any profession and any skill could ask a computer anything and it would respond well enough.

I won’t get into the technical why here (check the GPT - The Nerdy Details page), but by 2030, virtually every platform, small and big alike, will be able to efficiently and immediately generate high-quality content of all types, relevant to any person or query. The importance of this cannot be overstated.

What does that mean?

For businesses:

In 2023, most businesses involve operations where a human takes an input and generates a response relevant to the details of the business. Think of a support agent reading customer emails and writing replies, or an ads manager taking the store’s broad demographic information and delivering ads aimed at the masses.

Generative AI is set to supercharge businesses’ ability to create content, down to the individual level. For example, a store could use Generative AI to respond to individual customers’ emails, relevant to each customer’s needs and the store’s details. Alternatively, it could deploy a bot designed to answer customers’ individual queries or even perform basic actions on behalf of the business (e.g. book a table). Or it could deliver custom ads to individual consumers, relevant to the specific purchases they’ve made in the past. In short, Generative AI will help businesses dramatically enhance their ability to connect with individual consumers.
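To make that concrete, here is a minimal sketch of the email-reply case, assuming an OpenAI-style chat API (the 2023-era `openai` Python package). The store details, prompt wording and function name are invented for illustration, not any real product:

```python
# Hypothetical sketch: auto-drafting a reply to a customer email,
# grounded in the store's own details. Assumes the 2023-era openai
# package and an OPENAI_API_KEY in the environment.
import openai

STORE_DETAILS = """
Opening hours: 9am-6pm, Mon-Sat.
Returns accepted within 30 days with a receipt.
Click-and-collect available at the George Street branch.
"""

def draft_reply(customer_email: str) -> str:
    # Business context goes in the system prompt; the customer's
    # email is the user message. The model writes the reply.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "You are a polite customer-support agent. "
                        "Answer strictly using the store details below.\n"
                        + STORE_DETAILS},
            {"role": "user", "content": customer_email},
        ],
        temperature=0.3,  # keep replies consistent, not creative
    )
    return response.choices[0].message["content"]

print(draft_reply("Hi, can I return a jumper I bought three weeks ago?"))
```

The same pattern (business context in the system prompt, customer input as the user message) covers the query-answering bot too; the ads case just swaps the input for demographic or purchase data.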

For consumers:

Generative AI for consumers is what I’m most excited for in the future. The possibilities really are endless here. Much of the generative technology I documented exists today.

Content Generation:

You can generate a conversation between two robots (with GPT-3.5, Bard, Llama 2, Claude 2…) and have a speech synthesiser read it out to you (Speechify, Google Cloud Text-to-Speech, ElevenLabs…). You can even have an (unrealistic) 3D avatar say those words (DeepBrain AI)! The problem? The tech today just isn’t that good, or it’s too expensive and difficult to produce for it to become mainstream. But what’s to stop someone combining all these technologies once it becomes cheap enough (before getting bought out by Spotify, of course)?
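As a rough proof of concept, here is a sketch that stitches two of today’s pieces together: an LLM (again via the 2023-era `openai` package) scripts a short radio link, and Google Cloud Text-to-Speech reads it out. The host names are borrowed from the story; a real version would alternate between two distinct voices rather than use one:

```python
# Sketch of a DIY "AI radio" pipeline: an LLM writes the banter,
# a TTS service voices it. Assumes OPENAI_API_KEY and Google Cloud
# credentials are already configured.
import openai
from google.cloud import texttospeech

def write_link(last_song: str, next_song: str) -> str:
    # Ask the LLM for a few lines of host banter between songs.
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": f"Write three lines of warm radio banter between "
                       f"hosts Emil and Vicky, going from '{last_song}' "
                       f"into '{next_song}'.",
        }],
    )
    return resp.choices[0].message["content"]

def speak(text: str, out_path: str = "link.mp3") -> None:
    # Synthesise the banter to an MP3 with Google Cloud TTS.
    client = texttospeech.TextToSpeechClient()
    audio = client.synthesize_speech(
        input=texttospeech.SynthesisInput(text=text),
        voice=texttospeech.VoiceSelectionParams(
            language_code="en-GB",
            ssml_gender=texttospeech.SsmlVoiceGender.FEMALE,
        ),
        audio_config=texttospeech.AudioConfig(
            audio_encoding=texttospeech.AudioEncoding.MP3,
        ),
    )
    with open(out_path, "wb") as f:
        f.write(audio.audio_content)

speak(write_link("Doo Wop (That Thing)", "Love Is Only a Feeling"))
```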

Chatbots:

Again, chatbots have been around for a long time. Recently, they’ve been supercharged by a new type of algorithm called the Transformer (check GPT - The Nerdy Details again). While they’re incredibly advanced today (GPT-4 helped me write this essay; check out our conversation in the repo), we’ve only scratched the surface of what’s possible with these technologies. For them to provide as much utility as I described in the story, we need two things: first, a common framework that allows chatbots to interact with the world; second, for us to concede inconceivable amounts of personal data to these companies.

The first problem is a technical one. How would a future version of Pi book a local coworking space, train tickets or a table for dinner? I believe companies like WeWork, Trainline or OpenTable will develop plugins that give major chatbots the ability to interact with their services (much as OpenAI already runs a plugin store). A standard may even arise for how these plugins are coded; see the sketch below.
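For a flavour of what such a plugin might look like, here is a sketch built on OpenAI’s function-calling interface (released June 2023). The `book_meeting_room` action and its parameters are hypothetical; there is no real Melting Pot booking API:

```python
# Sketch: exposing a (hypothetical) coworking-space booking action
# to a chatbot via OpenAI's 2023 function-calling interface. The
# model decides when to call it and with what arguments; your code
# performs the actual booking.
import json
import openai

functions = [{
    "name": "book_meeting_room",  # hypothetical plugin action
    "description": "Book a private meeting room at a coworking space.",
    "parameters": {
        "type": "object",
        "properties": {
            "venue": {"type": "string"},
            "start": {"type": "string", "description": "ISO 8601 start time"},
            "minutes": {"type": "integer"},
        },
        "required": ["venue", "start", "minutes"],
    },
}]

resp = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user",
               "content": "Book me a room at The Melting Pot, "
                          "2pm today, for 30 minutes."}],
    functions=functions,
    function_call="auto",
)

call = resp.choices[0].message.get("function_call")
if call:
    args = json.loads(call["arguments"])
    print("Would call the venue's booking API with:", args)
```

A shared schema like the `parameters` block above is exactly the kind of thing a standard could settle on.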

The second one is more of a societal problem. Data.

Data - The Caveat:

Generative AI will only get more and more capable. But for it to become more useful, it has to be more personalised to us. That means feeding it information and data about us so that it can produce the most relevant content.

Companies (like Tesco) are salivating at the opportunity opening up in front of them. Tesco already collects information about what you buy and how often you buy it (you think a Clubcard is just so you can get good deals?). Soon, they’ll be able to deliver ads specialised to every individual, enticing each person with what they’re most likely to want at that moment.

Imagine an H&M ad delivered to your feed that goes: “Remember that green jacket you loved? It looks great. You’ll absolutely love this matching jumper. But quick! There’s only a few left in stock in your size. Click here to generate a photo of you wearing it!”
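To show how little stands between today’s tools and that ad, here is a sketch that generates it from a made-up purchase history; the data, prompt and retailer are all invented:

```python
# Sketch: turning a (made-up) purchase history into a personalised
# ad, in the spirit of the H&M example above. No real retailer API
# is involved; assumes the 2023-era openai package.
import openai

purchase_history = [
    {"item": "green bomber jacket", "bought": "2023-07-14"},
    {"item": "white trainers", "bought": "2023-06-02"},
]

prompt = (
    "Write a two-sentence ad for a matching jumper that references "
    f"the customer's recent purchases: {purchase_history}. "
    "Friendly and enticing, not pushy."
)

resp = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(resp.choices[0].message["content"])
```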

While the technology will be ready far earlier than late 2030, it’ll be a question of whether or not we’re societally ready for it, from a privacy and ethical point of view.

And what about the consumer? A big part of this story was my AI personal assistant, Pi. It works so well because, like all good PAs, it knows an incredible amount about me. In the story alone:

  • It knew details about my work, my calendar, meeting notes, and the sort of work I enjoy most
  • It knew my sister’s default location (i.e. home) from my Find My
  • It knew I was a Google customer for years
  • It knew my location 24/7, contacts, and countless other micro-nuances.

Needless to say, the information we’re set to give up in the future makes the Google or Cambridge Analytica of today look like child’s play. But do the ends justify the means? I (in the story) clearly find value in giving up my data for a personal assistant. I even pay for the privilege. Currently (2023 Leo speaking), I’m okay trading all that information for an assistant who can do what Pi does.

Of course, while it will be technologically possible, we may never get to that future because of privacy advocates like Apple (the outlier in Silicon Valley). Apple has long championed privacy-centric products (it made sharing the IDFA [a unique code advertisers use to track consumers across apps] optional, despite huge backlash from data scrapers including Facebook). Seeing as Apple is the hardware basis for much of this technology, it could prevent that future. That said, I can see entire services popping up designed to let you get the most out of your assistant.

Other Tech Advances:

The rest of the story contains snippets of where I perceive other technologies to be by 2030.

Autonomous Vehicles:

In the story, I take a ride into town in an autonomous vehicle. I think autonomous driving is a deceptively challenging problem; we’ve been promised self-driving cars for years now. But Waymo and Cruise are both actively expanding out of Silicon Valley today. If this progress continues, it makes sense that major metropolitan areas will see an influx of these sorts of rides. The challenge for those companies will come when expanding into rural areas with less well-defined roads.

Green Initiatives:

The entire world is “racing” towards a greener future, some countries faster than others, of course. The UK has banned the sale of new petrol and diesel cars from 2030 and is actively pushing other green initiatives. With the falling cost of batteries, we could see more incentives to produce clean energy, store it and sell it back to the grid when it’s not needed. This technology is actively in use today, but it’s typically reserved for houses with lots of wind or sunlight. Government incentives could help push the agenda further afield. Or maybe (read: hopefully) the world will lose its stigma against nuclear energy and pivot towards implementing it on a wider scale.

Education:

Education is poised to change dramatically. I found it hard to incorporate into the story, but there’s no doubt it’s due an evolution. From online degrees becoming more reputable to custom bots (maybe even in the form of your lecturer or an industry expert) that can teach you a concept in exactly the way you understand it, education is changing now more than ever. Still, the story is set only seven years from now, and universities (and education in general) are institutions hundreds of years old. They’re not going anywhere just yet. That said, I imagine the education my kids will get will be wildly different from the one I grew up on.
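For a taste of how close the “bot in the form of your lecturer” idea already is, here is a sketch where the entire persona lives in a system prompt (invented here), on the same 2023-era chat API as the earlier sketches:

```python
# Sketch: a "lecturer-style" tutor bot configured purely via a
# system prompt. The persona and question are invented examples;
# assumes OPENAI_API_KEY is set.
import openai

TUTOR_PERSONA = (
    "You are a patient university lecturer in computer science. "
    "Teach Socratically: give a short explanation, then ask one "
    "question that checks the student's understanding."
)

def ask_tutor(history: list, question: str) -> str:
    # Keep the running conversation so the tutor can build on it.
    history.append({"role": "user", "content": question})
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "system", "content": TUTOR_PERSONA}] + history,
    )
    answer = resp.choices[0].message["content"]
    history.append({"role": "assistant", "content": answer})
    return answer

chat = []
print(ask_tutor(chat, "Why does gradient descent need a learning rate?"))
```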

Conclusion:

Simply put, the world is advancing far faster than we realise. Granted, it won’t be unrecognisable from today; some things will never change. But the future I wrote is one I’m excited for. It’s a future where technology is customised to every individual, enabling us to do so much more with it. It’s a future where technology is an even greater multiplier of (the best of) humanity than it is today. It’s a future I’m excited to help create.

Leo, 23/08/23.

P.S. I tried feeding the entire story into ChatGPT and asking how I could improve it. An error popped up saying my prompt was too long…
