Category Archives: musings

I Dredd to think…

I’m writing this from one of my favourite cities in the world.

For me Hong Kong is the epitome of where urban dwellings meet the jungle. Tropics growing into brick, buildings built around ancient roots. It feels like a chapter from I Am Legend, except we’re all still very much alive.

The subject of technology outgrowing humans is rife here. I’m collaborating on a project with a friend, and it’s come up time and again as this city and the surrounding majors struggle to adopt innovations in a way that resonates with the humble origins the culture is built upon.

This week has left me reflecting on whether sustainable thinking will lend itself to mega metropolises, or whether global urbanisation will mean that rural areas get left behind.

As transport becomes more sustainable, more affordable, ever faster and more connected, will it conflict with the debate around borders being reimposed and trade restricted back to regions? Take Brexit as an example closer to home: if we leave the EU and customs barriers restrict trade at first, will the stagnation that follows set in so gradually that, by the time international exchange comes to a standstill, we won’t notice the daily conflict?

On a related note, but from a different angle, technology has allowed us all to become producers. Open SDKs and APIs, 3D printing and crowdfunding all mean that, rather than brands lending themselves to personalisation, self-made and personally tailored will be the next movement. Will brands play a role in enabling this, or will they move to a protective stance on their IP?

I like to think that all my favourite cities are characterised by diversity, and as such we will continue to encourage flexibility across the globe, joining forces to build resilience against what could otherwise be a tough future.

But as technology surges with intelligence at its core, the biggest question I am hearing repeatedly from all over is whether global cooperation and unlimited interaction will evolve into one diverse world where we see and share everything, or whether the future will take us back to the past.

Will Mega Metropolises mean just Mega blocks and Mega highways? Will we live in a Mega City One? Will it be more Skynet than that? So many film references and a tad Dredd I know, but I believe the next few years will impact this more than we care to acknowledge.

I judge you not.



“Our business is infested with idiots who try to impress by using pretentious jargon.”

So said David Ogilvy.

This post is born of a conversation with a colleague who asked what I thought the buzzwords for 2016 might be. I didn’t have an answer on the spot, as I hadn’t heard any on my first day back (travelling around Morocco for the festive period is a good way to escape jargon, btw).

Then I heard this word three times this week so I guess it’s on the list… (drumroll)… MARCHITECTURE. 

Wyyyaaaatt? I hear you say. The first time I heard this I rolled my eyes, slid down my seat and promptly left the room. The second time yielded a similar result; the third… well, I guess I had to try to nip it in the bud sooner rather than later.

So, ‘Marchitecture’. It’s apparently what happens when a marketing department and a technical department get together. In the decade or so I’ve been making techy, digi stuff for marketing this has just been ‘integration’ but hey, apparently we need a martech strat and a martech stack now.

Don’t get me wrong, marketing and technology should absolutely weave together and it’s not easy to compile an ecosystem that combines content and functionality with a robust framework in place. But why must we coin it so? It really doesn’t help the rep that we marketeers have to keep coming up with these terms.

My mild rant is affirmed by the hissy fit that autocorrect threw at me throughout the writing of this post. 



Left Brain, or write brain?

For years scientists summarised ‘memory’ as an intricate part of the brain, much like an ever-expanding filing cabinet (a neural super-computer, if you like), but in fact our memory is a brain-wide process. For example, driving a car is reconstructed from several areas; operating it lies in one section, getting from A to B in another, and the ‘oh crap that guy just carved me up’ in yet another.

In this intricate system, we have several areas of memory: sensory, which lasts a second; short-term working memory, which lasts up to a minute; and long-term memory, which lasts a lifetime (unless you’re a goldfish). Long-term memory then splits into explicit (conscious) and implicit (unconscious), and that then continues to break down into episodic, procedural and semantic memory systems.
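For the technically minded, the taxonomy above can be sketched as a simple nested structure. This is a toy model only; the grouping of episodic and semantic memory under explicit, and procedural under implicit, follows the standard neuroscience classification, which the post names but doesn’t fully lay out:

```python
# A toy model of the memory systems described above.
# Durations are the rough figures given in the post, not precise values.
MEMORY_SYSTEMS = {
    "sensory": {"duration": "about a second"},
    "short-term working": {"duration": "up to a minute"},
    "long-term": {
        "duration": "a lifetime",
        # Long-term memory splits into conscious and unconscious systems.
        "explicit (conscious)": ["episodic", "semantic"],
        "implicit (unconscious)": ["procedural"],
    },
}

# Walk the structure to list every leaf-level memory system.
def leaf_systems(tree=MEMORY_SYSTEMS):
    leaves = []
    for name, value in tree.items():
        if isinstance(value, dict):
            for sub, members in value.items():
                if isinstance(members, list):
                    leaves.extend(members)
            if not any(isinstance(v, list) for v in value.values()):
                leaves.append(name)
    return leaves
```

Calling `leaf_systems()` returns the two short-lived systems plus the three long-term subsystems, mirroring the breakdown in the paragraph above.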

So with all that in mind, who’s to say we couldn’t slip an updated folder into the filing cabinet and start to amend our memories, enhancing them, through neural prostheses…

Neural prosthetic devices are designed to provide artificial reconstruction of neurone-to-neurone connections where deterioration has occurred, so what if there was a safe way to insert these ‘memories’ into our neural architecture? Could we ‘remember’ that we’re fluent in a language, play guitar or know how to save a life?

If we could work out how to create the file to save it, then could we implant it as a memory? If we could create a common code that works with the electronic impulses in our brain and is understood through computer algorithms, could the future integration of memory be possible?

Gets the cogs pulsing, doesn’t it?



In the beginning

We simply had our imaginations.

Then Thomas T. Goldsmith Jr and Estle Ray Mann came along with the first interactive game, the ‘Cathode Ray Tube Amusement Device’, developed in 1947. The pair were renowned for their simulation skills and not their wordsmithing, you’ll be glad to know.

Soon after, we saw a burst of simple interactive programs such as ‘Mouse in the Maze’, ‘Bertie the Brain’ and Alan Turing’s ‘Chess’, capable of computing two-way problems but not complex algorithms. This was shortly followed by ‘Spacewar’ in 1962, a two-player game where you try to destroy each other’s starship. Arguably the first true video game, it took around 200 hours to code and was built by some students at MIT.

Where am I going with this?

Well, fast forward through the Odysseys and the varying intergalactic games to the Atari release of ‘Adventure’ in 1980, where we saw the text adventure visualised, albeit crudely, in a plethora of dragons, monsters and sword slaying. Then, with the help of RAM and better joysticks, we arrived in the world of the Sega Dreamcast and the NES, where imaginary friends like Sonic and the Super Mario Bros helped us through the 90s and into today. You’ll see my ramblings are leading to a pertinent question…

In a world where we now have technology that scans brain activity to read our minds, technology that creates worlds that don’t exist and technology that maps us to our surrounding climate, how long before we live in our imaginations, in a virtually real world?

Or what if I’m already living in it, and the world we think we live in is actually virtually imagined?

Wait, what…?


A confession

I have a fairly solid understanding of all things digital. Ask me a strategic question around marketing channels for various business objectives and I’ll give you a fairly sound piece of advice or opinion, based on insights and learnings from years of experience. Ask me about implementing that strategy and I can give you a breakdown of the key considerations on the fly. Ask me to analyse the risks and I can outline them for you, based upon the many times I’ve got things wrong and learned from it.

Ask me to do something technical like ‘make a laptop work’, and that is another thing entirely. Ask just about anyone I work with: I am the closet blonde in the office, sensible on the outside and technically illiterate on the inside (which says something in an office of over 500 people).

The proof was in my ‘faux pas’ this morning, when I managed to set every single piece of software on my laptop to open from within Picture Viewer. That’s special, isn’t it? It managed to evoke the following quip:

‘In the 12 years I’ve worked in IT I’ve never seen this done! That is quite an achievement.’

From my knight in shining armour, Andy, who came to save the day (again).

I’m so proud of this achievement I just felt I had to share it with you.

I’ll be back soon with something more insightful…


If you don’t make mistakes, you won’t make anything

Being wrong is OK; failures and false starts are a precondition of success.

Some of the most successful people and companies I know are so because they allow themselves to fail. Edison made over 200 light bulbs before he made the one that worked, but he learned something from every mistake before reaching the end result.

It is the wrong turns in work and life that define us; risks are a measure of who you are. Those of us who take risks will end up leading more fruitful lives. Knowledge is built upon things that happen, but if you stop taking risks and simply keep the same knowledge, it will quickly become unoriginal, and let’s face it, who wants that label?

So, go on, get it wrong! It’s the right thing to do.

‘Try again. Fail again. Fail better.’ Samuel Beckett
