AI killed the user interface. Enter the inter-userface

Part 3 of Jason English’s “Life at the End of the Funnel – A Technology Bard’s Tale”

JE Cortex, Apr 22, 2026

After making TV commercials and award-winning CD-ROM games for four years, my first little startup, Museworthy, got squished by a lawsuit and my credit cards were maxed out, so I went to work at the first web agency.

OK, Eagle River Interactive maybe wasn’t the very first, but we’re talking 1995 now. By offering decent pay for creative work and a funky open workspace in the West End, they sucked up most of the available designers, developers, and interactive talent in Dallas at the time.

I had written a few corporate videos in addition to the games, so I was pretty good at capturing business customer stories. I could doodle in Aldus FreeHand and Adobe Photoshop, mix audio in Macromedia Deck, and edit video in Premiere 2.0. I could make a simple HTML page in either a text file or PageMaker.

This “special set of skills” as Liam Neeson might put it, set me up to be an early Information Architect – not exactly a writer, designer, developer, or project manager. Just someone who could talk to clients and translate business and brand intent into interface design and functional software requirements.

Those early usability days

Since our users were on Netscape and AOL, our web interfaces were about as slick as they could be at the time: small screens, optimized for the lowest possible bandwidth, because most people were still on dial-up or ISDN connections.

For clients with a real budget, we could A/B test website designs in review sessions or focus groups. There was even a usability science lab down the street where you could track how people’s eyeballs moved around your prospective webpage design. We were all in our twenties, and for fun, we’d stay late and play Quake on the company T1 connection.

For us, the real “killer app” was Google. You’d just type what you wanted into the search bar, and bam! It would usually show pretty relevant results, rather than the random links of Yahoo!, with no stupid flashing ad banners. [This was long before the search engine’s enshittification, as Cory Doctorow dubbed it.]

Also cool: Dallas had acquired the Stars hockey team from Minnesota, and since hockey was my favorite sport to watch, I went in with my dad on season tickets. We could easily meet up at the office and walk to Reunion Arena. At one game, on October 25, 1996, I was bringing down a couple of beers at the start of the second period.

Darryl Sydor, my favorite Stars player at the time, slapped a one-timer from the top of the circle and it got redirected toward seat J7. They didn’t have corner netting back then. My dad said “Aw, I missed it.”

I didn’t. You know how in Looney Tunes cartoons, when Sylvester gets bonked in the head, it makes that kind of sheet metal sound, followed by tweety birds? It sounded exactly like that.

Agentic AI talk is cheap

Fast-forward 30 years, and AI has replaced the software user interface with a text field again. Instead of looking for visual cues and clicking around, we can just type in a prompt, or talk to it, and ask for whatever we want.

AI is finally here! It’s like that “computer” on Star Trek, where you just say “Computer! Give me an analysis of the inhabitants of this planet.” 

Except, without the egalitarian post-money society where everyone has equal rights and opportunity, and their basic needs are met, and nobody has to live next to a huge, noisy datacenter that sucks up all the energy and water to produce fake videos and replace annoying workers.

But there is a social upside. Almost a third of U.S. adults have reported having a romantic relationship with AI, compared to just 20 percent of high-school-aged teens.

Introducing my next biggest release: The Inter-Userface.

The Inter-Userface is a new agentic hybrid AI collaboration platform that provides a contextual conversational metadata reference mesh (or AHAICPCCMRM).

There are no screens or buttons to click, though it already has an MCP server. You just ask your agent to talk to someone else’s agent; then, when that doesn’t work, you do the rest.

You conduct a discovery process, looking through your device’s identity lists and email history to see if there is a 9- or 10-digit code, or “phone number,” to dial. If it isn’t there, you can still send a lightly encrypted payload over SMTP to inquire about the call code.

You can bring your own model and ask it to translate your request into a formal EML query, which will certainly increase your chances of getting a meeting.

“Dearest Bob – I am just reaching out to touch base. I will happen to be in your area next Tuesday. Let me know if you would like to talk in a physical location, with a selection of whatever food or beverages are acceptable to you. Here is my phone number. – With warmest regards, JBob.”
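For the protocol sticklers, that “lightly encrypted payload over SMTP” step can be sketched in a few lines of Python (the addresses and relay host here are purely hypothetical, and the “EML” is, of course, just plain English in the message body):

```python
from email.message import EmailMessage

# Hypothetical sender and recipient; "EML" is just plain English.
msg = EmailMessage()
msg["From"] = "je@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Touching base (EML v1.0)"
msg.set_content(
    "Dearest Bob - I am just reaching out to touch base.\n"
    "I will happen to be in your area next Tuesday.\n"
    "Let me know if you would like to talk in a physical location.\n"
    "- With warmest regards, JBob\n"
)

# Actually transmitting the payload requires a (hypothetical) SMTP relay,
# plus a human in the loop on the other end:
# import smtplib
# with smtplib.SMTP("smtp.example.com") as s:
#     s.send_message(msg)

print(msg["Subject"])
```

Note that the send itself is left commented out; per the spec, the slowest hop in this protocol is Bob.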

Then sit and wait. It might take a couple of seconds to get a response, or forever. There should always be a human in the loop, because everyone skips out on their own Calendly invites.

A pair vibecoding collaboration exercise

So whether you are dropping by an office, or a coffee shop, here’s where the Inter-Userface becomes incredibly powerful.

After a handshake, you initiate an ultra high-speed peer-to-peer data exchange in EML [Author’s note: English Markup Language is an open-source framework I fathered before joining Intellyx in 2017].

EML message brokers for both parties identify and pass payloads of “words” with automatically assigned semantics by an organic computer that contains an infinite number of parameters, sort of like a “fuzzy logic” GPU.

While the data exchange may actually seem slow, there is a wealth of underlying telemetry data for this “conversation” including visual recognition of physical cues and speech tone filters for the golden values of intention, attention and emotional perception. These valences reveal underlying patterns, hopefully moving the exchange forward toward mutually achieving shared objectives.

Private or proprietary data shared in these scenarios should be encrypted as secrets, or EML whispers. These must be hashed and only tangentially referred to, to avoid unauthorized sharing, unless you are in a securely airgapped location such as a park bench.

The Intellyx Take

Now, I know this inter-userface can be a little daunting the first time you try it. Since the pandemic, we have become accustomed to remote operations. But don’t worry: we’re hard-wired for it.

Some companies are even taking this P2P meeting process into a many-to-many format, foregoing a centralized office budget in favor of annual or quarterly team meetings at desirable offsite locations, taking advantage of the agile camaraderie and multiple perspectives of a high-cardinality in-person conference.

EML will become a lot less popular someday and likely get replaced with a more commercially friendly protocol like CML, but for now, most people you can contact in the real world will still support EML integration.

Anyway, TL;DR. I know nobody will read this column. You can go talk amongst yourselves now.


Copyright ©2026 Intellyx B.V. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers. No AI resource was used to write this article. Image source: Adobe Image Express with edits by author.

Principal Analyst & CMO, Intellyx. Twitter: @bluefug