FT Bracken Bower Prize

The Fuzzy and the Techie

Why the liberal arts will rule the digital world

By Scott Hartley


The terms “fuzzy” and “techie” are used at Stanford University to describe, respectively, students of the humanities and social sciences, and students of engineering or the hard sciences.

Beneath these light-hearted associations lie more charged opinions on degree equality, vocational application, and the civic role of education. These opinions have bubbled beyond the vast acreage of Stanford’s palm-fringed quads and golden hillsides, into Silicon Valley and, by metaphorical extension, into the wider economy the Valley powers.

These decades-old monikers have come to represent a modern incarnation of physicist and novelist Charles Percy Snow’s “Two Cultures”: a false dichotomy between those versed in the liberal arts and those with the requisite vocational skills to succeed in tomorrow’s technology-led economy.

I seek to reframe this debate by taking into account the very real need for science, technology, engineering and mathematics — the so-called “Stem” disciplines — while acknowledging their faux opposition to the liberal arts. These need not be mutually exclusive.

Indeed, as we evolve our technology to make it ever more accessible and democratic, and as it becomes ever more ubiquitous, baked into every moment of our lives, the timeless questions of the liberal arts become the most timely requirements of our new tools.

Peering behind the veil of our greatest technology companies, it is often our greatest humanity that makes them whole. There are very real and very relevant roles for individuals of all backgrounds to play in tomorrow’s tech economy. Our problems require the fuzzy and the techie to come together. We must consider the value of the liberal arts as we continue to embrace our new tools.

A number of recent books have focused our attention on the inexorable march of technological change that many argue puts those without technical ability at risk. A great deal of attention in the discussion has focused on the threat to jobs from increasingly smart and nimble machines.

While the risk of skill-biased technological change is clear and present, this new era also offers remarkable opportunities, not only for the technologically literate but also for those without traditional expertise in technology — those who know context rather than code.

Technology provides innumerable opportunities for people to create, rather than react passively to change. Those with less expertise can not only collaborate with tech experts, but also drive the innovation of new products and services. Tools have become modular, and the methods and interfaces used to control and extract value from technology have become intuitive. The building blocks come pre-assembled, and those that don’t already snap together have ample documentation online explaining their assembly.

Today, collaboration between fuzzies and techies is flourishing. Technology company founders are not always the technologists stereotypically portrayed in the media — the computer science wizard and proverbial high school dropout.

They are as likely to come from backgrounds in fashion, education or media as from deep technology. They have the charisma and vision to bring people together around their ideas, and they know enough about technology to partner with techies to execute their vision. Their comparative advantage lies in their ability to identify problems and ask questions, not necessarily to locate answers.

If machines keep getting better — and they will — humans must become better versions of themselves. The rise of the robots, which has been persuasively heralded, will create a greater need for our very basic humanity: for empathy and the soft skills. Automation will take away repetitive human tasks that do not require higher-level problem solving. Being a techie is not the antidote to redundancy in tomorrow’s economy; being more human is.

The choice between fuzzy and techie is not a zero-sum decision. Each is vital to the other’s success. They are necessary in equal measure throughout our world and in how we organise it. The new tools allow fuzzies greater fluency with technology; the advances in machines require greater engagement of our humanity. If you want to know an answer, you’ll ask a machine, but if you want to discover a question, you’ll ask a human.

As we collect more and more data, we must inquire further as to its source, bias, and application. As algorithms come to command our world, we must remember that, even shrouded in mathematics, they are not objective. As screens interrupt our every moment, we must pause to question how we build to maximise our scarcest resource, namely our time and attention.

Data science is a new career path, but we have forgotten about data literacy: knowing the domain and the context, and how to ask questions of the raw data. Within big data are small patterns, patterns we come to see when machines help surface information and humans transform it into knowledge.

Humans are the missing, and essential, link in artificial intelligence. They are the translators and the last-mile delivery agents. “Deep learning AI” also requires deep-thinking humans. To develop products that truly engage, we must appeal to humans’ psychology, sensibilities, and needs. The path forward is an optimistic one, rooted in performing more complex, non-routine tasks, and it is one that requires both the fuzzy and the techie.

CP Snow lamented the growing divide between the sciences and the humanities, speaking of the “mutual incomprehension . . . hostility, and dislike” that had taken hold. To avoid repeating that mistake, we must embrace technology yet not forget the liberal arts, which give meaning and explain why, not just how, we build.

Scott Hartley is a start-up adviser and venture capitalist

* * * * * * * *

Blockchain Babel

How management theory can bring clarity to the crypto-craze

By Igor Pejic


“Silicon Valley is coming,” JPMorgan Chase chief executive Jamie Dimon admonished the bank’s shareholders. Though he avoided mentioning it by name, he was referring to blockchain, the technology behind bitcoin.

Blockchain is a protocol enabling distributed ledgers and promising almost instantaneous and nearly free transactions. With blockchain’s help, money and assets can be moved without a central authority.

Validation is performed via a peer-to-peer network that does away with powerful intermediaries that authenticate or settle transactions. And while bitcoin is the first and best-known blockchain, the hunt for the killer application is in full swing.
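To make the mechanism concrete, here is a minimal sketch in Python of a hash-linked ledger. It is an illustration only, not the bitcoin protocol: it omits proof-of-work, digital signatures, and peer-to-peer consensus, and the function names are invented for the example. What it demonstrates is the core idea above: any peer holding a copy of the chain can replay the hashes and detect tampering without appealing to a central authority.

import hashlib
import json
import time

def block_hash(block):
    # Deterministically hash a block's contents.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(transactions, prev_hash):
    # A block records transactions plus the hash of its predecessor.
    return {"timestamp": time.time(),
            "transactions": transactions,
            "prev_hash": prev_hash}

def validate_chain(chain):
    # Any peer can replay the hash links; altering one block
    # breaks every link that follows it.
    return all(block["prev_hash"] == block_hash(prev)
               for prev, block in zip(chain, chain[1:]))

genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
chain = [genesis, make_block(["bob pays carol 2"], block_hash(genesis))]
print(validate_chain(chain))  # True: the links check out

If anyone edited the transactions in the first block, its hash would change and validate_chain would return False for every honest copy of the ledger; in the real bitcoin network, proof-of-work makes rewriting those links prohibitively expensive.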

Following a Cambrian explosion of fintech start-ups in the past five years, today more than 300 of them are hounding the industry that fuels the global economy. And investors keep injecting capital at an unprecedented speed. At fintech and payments conferences, the jeans of Silicon Valley have given way to the suits of Wall Street.

Many start-ups are managed by experienced executives of big banks, while an impressive roster of industry figures is lining up behind initiatives such as the R3 CEV consortium or the Hyperledger Project. Only last year, a study by Greenwich Associates found 94 per cent of Wall Street bankers believed the blockchain had the potential to change the finance industry forever.

Banks are not the only ones whose imagination has been sparked. Information technology giants, journalists, academics, entrepreneurs, and venture capitalists alike are in a gold-rush mood. Academic work on bitcoin is skyrocketing. Publications in major journals rose 267 per cent in two years, according to Coindesk, a cryptocurrency news portal.

Magazines and institutes dedicated to the new technology are mushrooming across the globe, among them the “Digital Currency Initiative” at the MIT Media Lab.

The reason for this hype is clear: we are talking about a technology that, according to Spanish bank Santander, could save banks $15bn-$20bn a year from 2022 onwards — without their having to alter their business model.

While most experts see unprecedented potential in the blockchain, banks, payment processors, and credit card companies fret that brainy entrepreneurs who transform IQ points into dollars could cast a pall over their core business. They should. The attackers that are coming are not just another PayPal.

The online payments group simply added another layer on top of the existing financial system. The blockchain, on the other hand, holds the keys to changing the system from the ground up. But can start-ups really continue their impressive winning streak and dethrone the tight cartel of moneymen? And what role will cloud-giants and data behemoths play?

Blockchain pundits are a mixed bunch of tech-evangelists, anarcho-libertarians, banking gurus, and data-security sticklers. But while everybody has an opinion on the blockchain, there is little sound analysis that could link the technological potential to the market mechanisms described by decades of management theory. At this point “Blockchain Babel” sets in.

My book will unmask common myths about blockchain. I will argue that the major challengers to banks will not be fintech start-ups, but data collectors. Google and Apple have grown into Goliaths at unprecedented pace and banking is the last bastion they have not yet conquered.

They have corralled powerful brands and consumers’ trust, and they are expert at turning data into money. Though they remain pointedly silent about the blockchain, their strategic position and activities speak for themselves. To survive, banks have to turn from service providers into network orchestrators that will generate value through network capital.

The transformational power of blockchain technology is by no means limited to finance. It reaches from secure identification and e-voting, to new compensation models for artists and media professionals, all the way to smart contracts and property rights. But banking is the industry that will be transformed first and will have an impact on all others.

Smart contracts, for example, are built on the same blockchain protocols that handle payments. And ultimately, the financial system is the lifeblood of every economy. Which actors prevail in it, which strategies they select, and which business models they commit themselves to will have an impact everywhere.

Igor Pejic is Marketing Director at Austria Card, a European digital security company

* * * * * * * *

Mental Meltdown

How work stress becomes a ticking time bomb as millennials enter the workforce

By Nora Rosendahl


“Doctors’ waiting rooms, it turns out, were good places for me to ask myself a lot of questions about the kind of life I was living.” — Arianna Huffington after her collapse in 2007.

The financial meltdown has caught all the media’s attention. Yet an invisible and far more dangerous meltdown is happening in the shadows — a mental meltdown. We can now see the early symptoms: increasing rates of burnout, exhaustion and stress-related disease. But the worst is yet to come, and the millennial generation is at greatest risk.

When we look at the statistics, we can only conclude that our workforce is slowly deteriorating from the inside.

An estimated one in four working-age women and men in the developed world are exhausted or burnt out. Depression will be the leading cause of disease globally by 2030, according to the World Health Organisation. In the UK, stress has already emerged as the top cause of illness.

This may sound like a topic for a mental health seminar, but the business world should take note.

What makes mental meltdown such a dangerous and costly enemy is that it has so many consequences. The Benson-Henry Institute estimates that 60 to 90 per cent of doctor visits are to treat stress-related conditions. Studies show that US employers already spend 200 to 300 per cent more on the indirect costs of healthcare — in the form of sick days, absenteeism, and lower productivity — than they do on actual healthcare payments. In Germany, work days lost to psychological illness have risen by more than 80 per cent in 15 years, and it is estimated that burnout is costing the country up to €10bn annually.

And these numbers do not even consider the really damaging consequences of exhaustion — poor decision-making, time and energy wasted on unnecessary tasks, a draining corporate culture, feelings of inadequacy as you put in the hours, yet still feel you are not doing enough.

Why is this happening? In the developed world the load on our mental resources has been slowly increasing, and powerful trends in society and the workplace are now further adding to it.

On a societal level, how we view success is fuelling the flames. Exhaustion and burnout are taboo topics. A study by Harvard Medical School showed that a staggering 96 per cent of leaders reported that they felt burnt out, yet no one talks about it. Especially among high-achieving knowledge workers, the unspoken rule is that there is nothing a bit of extra motivation, a double-espresso and a fake smile will not cure.

Fundamentally, how our society speaks about career success has given workaholism elite status. Digital devices keep us “always on”, adding to the mental load even when we are off work. The start-up boom is creating heroic stories of overcommitted entrepreneurs for the rest of us to admire.

In short, the output of a whole generation is at risk, and we are celebrating the disease.

In the workplace, the future reads like a recipe for added mental load. Artificial intelligence will threaten tens of millions of knowledge jobs, effectively pushing out comfortable routine tasks and increasing the need for high-quality thinking. Careers are becoming more fragmented: freelance and project work is on the rise; and those lucky enough to hold down a steady job face pressure to succeed and move on to the next role faster.

It gets worse. The millennial generation, only now entering the workforce in large numbers, seems to provide the most fertile ground for the mental meltdown. Already, as many as 25 per cent of US college students have a diagnosable mental illness.

Millennials are sadly often described with one word: “entitled”. This generation grew up as victims of the “self-esteem theory”, which links high self-esteem to good grades and career success. It was a well-intended theory, but along with parents dispensing praise and coaches handing out trophies just for participating came a tendency towards self-absorption. For this generation, the aftermath of the financial crisis has been devastating. It is psychologically difficult to adapt to dwindling job opportunities and increased financial insecurity when you feel you are settling for less than you are worth.

Millennials are also the first generation to reach adulthood under the watchful eye of social media. Whereas earlier generations barely knew what happened outside their closest circle of friends, millennials have grown up knowing the activities of all their friends and acquaintances.

This nonstop social comparison leads to feelings of insecurity and self-consciousness.

These are just a few of the reasons millennials are now topping the stress charts. In a study by the American Psychological Association almost 40 per cent of millennials said their stress had increased over the past year (compared, for example, with 33 per cent for baby boomers). The consequences are clear and frightening. Already, 19 per cent of millennials have been diagnosed with depression, compared with 12 per cent of baby boomers.

As these people enter the workforce, symptoms that today are merely alarming will become a ticking time bomb.

Nora Rosendahl is the co-founder of YOU-app, a self-improvement company with the mission to create billions of happier, healthier humans

Copyright The Financial Times Limited 2024. All rights reserved.