
KonstantinDr

3 years ago

Early Adopters And the Five WHYs

Product management wizardry.


Early adopters buy a product even if it hasn't hit the market or has flaws.

Who are the early adopters?

Early adopters are the first to try or buy a new technology or product, ahead of everyone else. They're risk-tolerant and can provide initial cash flow and early product reviews. They help a company's new product or technology gain social proof.

Early adopters are most common in the technology industry, but they're in every industry. They don't follow the crowd. They seek innovation and report product flaws before mass production. If the product works well, the first users become loyal customers, and colleagues value their opinion.

What to do with early adopters?

Early adopters can provide feedback, initial product promotion, first sales, and validation of the product's value.

How to find early adopters?

Start with your immediate environment and target audience. Communicate with them to see if they're interested in your value proposition.

1) Innovators (2.5% of the population) are risk-takers seeking novelty. They are the first to buy new and trendy items and drive social innovation, though they are usually a small elite;

2) Early adopters (13.5%) are inclined to accept innovations but are more cautious than innovators; they start using novelties once innovators or famous people do;

3) The early majority (34%) is conservative; they start using new products only after many people have mastered them. Once the early majority accepts an innovation, it becomes ingrained in people's minds;

4) The late majority (34%) adopts even later, which means the novelty has become a mass-market product; by then, innovators are already using newer products;

5) Laggards (16%) are the most conservative, usually elderly people who keep using the same products.
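As a quick sanity check (a minimal Python sketch; the shares are taken straight from the list above), the five segments sum to 100%, and the cumulative curve shows why the early majority is the tipping point into the mass market:

    # Rogers' diffusion segments as shares of the population.
    segments = {
        "innovators": 2.5,
        "early adopters": 13.5,
        "early majority": 34.0,
        "late majority": 34.0,
        "laggards": 16.0,
    }

    cumulative = 0.0
    for name, share in segments.items():
        cumulative += share
        print(f"{name:15} {share:5.1f}%  cumulative: {cumulative:5.1f}%")

    # With only innovators and early adopters on board, you have reached
    # just 16% of the market; crossing into the early majority is what
    # takes a product mainstream.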

Stages of new information acceptance

1. The information seems strange and is rejected by most people; only innovators accept it;

2. When early adopters join in, more people begin to believe it's not so bad; once a critical mass is reached, the novelty becomes fashionable and most people use it;

3. Fascination with the novelty peaks, then declines; the late majority and laggards start using it; the novelty becomes obsolete, and innovators move on to something new.

Problems with early implementation

Early adopter sales have disadvantages.

Higher risk of defects

Selling to early adopters increases the risk that defects reach users. Early adopters are often influential, so defects can shape long-term perception of the brand and its products.

Not what was expected

First-time buyers may be disappointed by the product. Marketing messages can mislead consumers, and if the first users believe the company misrepresented the product, this will affect future sales.

Compatibility issues

Some technological advances cause compatibility issues. Consumers may be disappointed if new technology is incompatible with their electronics.

The 5 Whys method

Let's talk about the 5 Whys, a good tool for finding the root causes of project problems. It is also known as the five whys rule, method, or questions.

The 5 Whys technique came from Toyota's lean manufacturing and helps quickly determine a problem's root cause.

As easy as one-two-three, you simply do this:

  1. We identify and frame the issue for which a solution is sought.

  2. We ask ourselves "why?" again and again. The first 2-3 answers are often very dull, making you want to give up on this pointless exercise. After that, however, things get interesting. Occasionally it's so fascinating that you wonder whether you really needed to know.

  3. We consider the final response, ponder it, and choose a course of action.

Always do the 5 whys with the customer or team to have a reasonable discussion and better understand what's happening.
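As a minimal sketch, the whole exercise fits in a few lines of Python (a hypothetical helper, not a real tool); note that it records every intermediate answer and doesn't force the count to exactly five:

    # Interactive 5-Whys helper (illustrative sketch).
    def five_whys(problem, max_whys=7):
        chain = [problem]
        print(f"Problem: {problem}")
        for i in range(max_whys):
            answer = input(f"Why ({i + 1})? ").strip()
            if not answer:  # no deeper cause found; stop early
                break
            chain.append(answer)
        # Writing the chain down is half the method's power.
        print("Root-cause chain:")
        for depth, step in enumerate(chain):
            print("  " * depth + step)
        return chain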

The "five whys" is also a wonderful, dead-simple tool for introspection. With practice, it kicks in almost automatically in any situation, like "I can't force myself to work, my mood is bad in the morning" or "why did I decide I have no life without this 20,000-ruble food processor that will take up half of my rather big kitchen?"

An illustration of the five whys

A simple but real example from my work practice that I think is very telling, given the participants' low IT skills. Anonymized, of course.

Users spend too long looking for tender documents.

Why? Because they must search through many company tender documents.

Why? Because the system can't filter department-specific bids.

Why? Because our contract management system requirements didn't include a department-tender link. That's it, right? We'll add a filter and be happy. But still…

Why? Because we based the system's requirements on regulations for working with paper tender documents (back when they still had envelopes and envelope-opening sessions), not electronic ones, and there was no search mechanism.

Why? Because when drafting the requirements, we didn't consider how our work would change when switching from paper to electronic tenders.

Now I know what to do. Tactically, we add a filter, enter department data, and teach users to use it. Strategically, we review the same forgotten requirements to make all the necessary changes in one package, and we add this check to the acceptance checklist for final requirements in the future.

Errors when using 5 why

Five whys seems simple, but it can be misused.

Popular ones:

  1. Turning it into a blame game. The 5 Whys method focuses on identifying underlying causes, not on criticizing others. So at the third step, it is not a good idea to conclude that the system is ineffective because users are stupid and that we can therefore do nothing about it.

  2. Fighting with all your might so that the outcome is exactly 5 reasons, neither more nor less. Five questions is the typical number (it sounds nice, yes), but in reality there could be 3 or 7.

  3. Failing to capture the in-between responses. It is difficult to overestimate the power of the written or printed word, and the result is mediocre when the intermediate steps are lost. That's it, I suppose. Simple, quick, and brilliant, like other project management tools.

Conclusion

Today we analyzed two important research tools:

Early adopters and the 5 Whys. We've walked through cases and live examples of how these methods help with product research and identifying growth points. Next up: the HADI cycle.

Thank you for your attention ❤️

Greg Satell

3 years ago

Focus: The Deadly Strategic Idea You've Never Heard Of (But Definitely Need To Know!)

Photo by Shane on Unsplash

Steve Jobs' initial mission at Apple in 1997 was to destroy. He killed the Newton PDA and Macintosh clones. Apple stopped trying to please everyone under Jobs.

Afterward, there were few highly targeted moves. First, the pink iMac. Modest success. The iPod, iPhone, and iPad made Apple the world's most valuable firm. Each maneuver changed the company's center of gravity and won.

That's the idea behind Schwerpunkt, a German military term meaning "focus." Jobs didn't need to win everywhere, just where it mattered, so he focused Apple's resources on a few key goods. Finding your Schwerpunkt is more important than charts and analysis for excellent strategy.

Comparison of Relative Strength and Relative Weakness

The iPod, Apple's first major hit after Jobs' return, didn't attack Microsoft and the PC head-on; instead it focused Apple's attention on a fledgling, fragmented market that generated "sucky" products. Apple couldn't have taken on the computer titans at that stage, yet it beat them.

The move into music players used Apple's particular capabilities, especially its ability to build simple, easy-to-use interfaces. Jobs' charisma and stature, along with his understanding of intellectual property rights from Pixar, helped him build up the iTunes Store in what was a licensing quagmire at the time.

In Good Strategy | Bad Strategy, management researcher Richard Rumelt argues that good strategy uses relative strength to counter relative weakness. To discover your main point, determine your abilities and where to effectively use them.

Steve Jobs did that at Apple. Microsoft and Dell, who controlled the computer sector at the time, couldn't enter the music player business. Both sought to produce iPod competitors but failed. Apple's iPod was nobody else's focus.

Finding The Center of Attention

In a military engagement, leaders decide where to focus their efforts by assessing the commander's intent, the situation on the ground, the topography, and the enemy's posture on that terrain. Officers spend their careers learning about the schwerpunkt.

Business executives must assess internal strengths including personnel, technology, and information, market context, competitive environment, and external partner ecosystems. Steve Jobs was a master at analyzing forces when he returned to Apple.

He believed Apple could integrate technology and design for the iPod and that the digital music player industry sucked. After analyzing competitors' products, he was convinced he could produce a smash hit by putting "1,000 songs in my pocket."

The only difficulty was that the necessary technology didn't exist yet. External ecosystems were needed. On a trip to Japan to meet with suppliers, a Toshiba engineer mentioned that the company had produced a tiny memory drive approximately the size of a silver dollar.

Jobs knew the memory drive was his focal point. He wrote a $10 million cheque and acquired exclusive rights to the technology. For a time, none of his competitors could recreate the iPod and its "1,000 songs in my pocket."

How to Enter the OODA Loop

John Boyd, a fighter pilot, invented the OODA loop to improve his own decision-making. First you OBSERVE your surroundings, then you ORIENT that information using previous knowledge and experiences. Then you DECIDE and ACT, which changes the circumstances you must then observe, orient, decide, and act on again.
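As a toy sketch in code (the function signature and fixed cycle count are invented for illustration), the loop looks like this:

    # A toy rendering of Boyd's OODA loop (illustrative only).
    def ooda_loop(observe, orient, decide, act, cycles=3):
        for _ in range(cycles):
            raw = observe()            # OBSERVE the current situation
            picture = orient(raw)      # ORIENT it with prior experience
            action = decide(picture)   # DECIDE on a course of action
            act(action)                # ACT, which changes the situation,
                                       # so the next cycle observes anew

Whoever runs this cycle faster than the opponent sets the terms the opponent must react to.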

Steve Jobs used the OODA loop to decide to give Toshiba $10 million for a technology it had no use for. He compared the new information with earlier observations about the digital music market.

Then something much more interesting happened. The iPod was an instant hit, changing competition. Other computer businesses that competed in laptops, desktops, and servers created digital music players. Microsoft's Zune came out in 2006, Dell's Digital Jukebox in 2004. Both flopped.

By then, Apple was poised to unveil the iPhone, forcing its competitors to Observe, Orient, Decide, and Act all over again. Boyd called this getting inside the opponent's OODA loop. Constantly reacting to Apple, they could never gain the initiative.

Microsoft and Dell were titans back then, but it's hard to recall. Apple went from near bankruptcy to crushing its competition via Schwerpunkt.

Rather than a destination, it is a journey

Trying to win everywhere is a strategic blunder. Win significant fights, not trivial skirmishes. Identifying a focal point to direct resources and efforts is the essence of Schwerpunkt.

When Steve Jobs returned to Apple, PC firms were competing, but he focused on digital music players, and the iPod made Apple a player. He launched the iPhone when his competitors were still reacting. When Steve Jobs said, "One more thing," at the end of a product presentation, he had a new focus.

Schwerpunkt isn't static; it's dynamic. Jobs' ability to observe, refocus, and modify the competitive backdrop allowed Apple to innovate consistently. His strategy was tailored to Apple's capabilities, customers, and ecosystem. Microsoft or Dell, better suited for the enterprise sector, couldn't succeed with a comparable approach.

There is no optimal strategy, only ones suited to a given environment, where relative strength can be used against relative weakness. Discovering the center of gravity where you can break through is more of a journey than a destination; it becomes evident only once you arrive.

Sean Bloomfield

3 years ago

How Jeff Bezos wins meetings over

Photo by Christian Wiediger on Unsplash

We've all been there: You propose a suggestion to your team at a meeting, and most people appear on board, but a handful or small minority aren't. How can we achieve collective buy-in when we need to go forward but don't know how to deal with some team members' perceived intransigence?

Steps:

  1. Investigate the divergent opinions: Begin by sincerely attempting to comprehend the viewpoint of your disagreeing coworkers. Maybe it makes sense to switch horses in the middle of the race. Have you completely overlooked a blind spot, such as a political concern that could arise as an unexpected result of proceeding? This is crucial to ensure that the person or people feel heard as well as to advance the goals of the team. Sometimes all individuals need is a little affirmation before they fully accept your point of view.

  • If, after studying and evaluating, you see the necessity to align with the divergent position, do so: it says a lot about you as a leader to be someone who always lets the perceived best idea win, regardless of its originating channel.

  • If, after investigation and assessment, you determine that you must adhere to the original strategy, we go to Step 2.

  2. Disagree and Commit: Jeff Bezos, CEO of Amazon, has had this experience, and Julie Zhuo describes how he handles it in her book The Making of a Manager.

It's OK to disagree while the team is choosing a direction, but it's not OK to accidentally or purposefully damage the team's efforts because you disagree. Let the team know your opinion, but then help them achieve company goals even if you disagree. The outcome is unknown; you could be wrong in today's ever-changing environment.

So next time a team member seems to be dissenting and you've tried the previous tactics, you might ask them in the meeting: "I understand you, but I don't want us to leave without you on board. I need your permission to commit to this approach. Would you give us your commitment?"

You might also like

Theo Seeds

3 years ago

The nine books that have fundamentally altered the way I view the world

I read 53 books last year and hope to do so again.

Books are best if you love learning. You get a range of perspectives, unlike podcasts and YouTube channels where you get the same ones.

Book quality varies. I've read useless books. Most books teach me something.

These 9 books have changed my outlook in recent years. They've made me rethink what I believed or introduced me to a fresh perspective that changed my worldview.

You can order these books yourself. Or, read my summaries to learn what I've synthesized.

Enjoy!

Fooled By Randomness

Nassim Taleb worked as a Wall Street trader. He used options to bet on unlikely events like stock market crashes.

Using financial models, investors predict stock prices. The models assume constant, predictable company growth.

These models base their assumptions on historical data, so they assume the future will be like the past.

Fooled By Randomness argues that the future won't be like the past. We keep seeing supposedly impossible market crashes, like 2008's housing market collapse. The world changes too quickly to rely on historical data: by the time we understand how it works, it's changed.

Most people don't live to see history unfold. We think our childhood world will last forever. That goes double for stable societies like the U.S., which hasn't seen major turbulence in anyone's lifetime.

Fooled By Randomness taught me to expect the unexpected. The world is deceptive and rarely works as we expect. You can't always trust your past successes or what you've learned.

Antifragile

More Taleb. Some things, like the restaurant industry and the human body, improve under conditions of volatility and turbulence.

We didn't have a word for this counterintuitive concept until Taleb wrote Antifragile. The human body (which responds to some stressors, like exercise, by getting stronger) and the restaurant industry both benefit long-term from disorder (when economic turbulence happens, bad restaurants go out of business, improving the industry as a whole).

Many human systems are designed to minimize short-term variance because humans don't understand it. By eliminating short-term variation, we increase the likelihood of a major disaster.

Once, we put out every forest fire we found. Then, dead wood piled up in forests, causing catastrophic fires.

We don't like price changes, so politicians prop up markets with stimulus packages and printing money. This leads to a bigger crash later. Two years ago, we printed a ton of money for stimulus checks, and now we have double-digit inflation.

Antifragile taught me how important Plan B is. A system with one or two major weaknesses will fail. Make large systems redundant, foolproof, and change-responsive.

Reality Is Broken

We dread work. Work is tedious. Right?

Wrong. Work gives many people purpose. People are happiest when working. (That's why some are workaholics.)

Factory work saps your soul, office work is boring, and working for a large company you don't believe in and that operates unethically isn't satisfying.

Jane McGonigal says in Reality Is Broken that meaningful work makes us happy. People love games because they simulate good work. McGonigal says work should be more fun.

Some think they'd be happy on a private island sipping cocktails all day. That's not true. Without anything to do, most people would be bored. Unemployed people are miserable. Many retirees die within 2 years of retiring, far more than you'd expect.

Instead of complaining, find meaningful work. If you don't like your job, it's because you're in the wrong environment. Find the right setting.

The Lean Startup

Before the airplane was invented, Harvard scientists researched flying machines. Who knew two North Carolina weirdos would beat them?

The Wright Brothers' plane design was key. Harvard researchers were mostly theoretical, designing an airplane on paper and trying to make it fly in theory. They'd build it, test it, and it wouldn't fly.

The Wright Brothers were different. They'd build a cheap plane, test it, and it'd crash. Then they'd learn from their mistakes, build another plane, and it'd crash.

They repeated this until they fixed all the problems and one of their planes stayed aloft.

Mistakes are considered bad. On the African savannah, one mistake meant death. Even today, if you make a costly mistake at work, you'll be fired as a scapegoat. Most people avoid failing.

In reality, making mistakes is the best way to learn.

Eric Ries offers an unintuitive recipe in The Lean Startup: come up with a hypothesis, test it, and fail. Then try again with a new hypothesis. Keep iterating, learning from each failure.

This is a great startup strategy. Startups are new businesses. Startups face uncertainty. Run lots of low-cost experiments to fail, learn, and succeed.
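A loose sketch of that experiment loop (the run_experiment interface is invented for illustration and assumed to return True for a validated hypothesis):

    # Build-measure-learn in the spirit of The Lean Startup (sketch).
    def lean_iterate(hypotheses, run_experiment):
        learnings = []
        for hypothesis in hypotheses:
            validated = run_experiment(hypothesis)  # cheap, fast test
            learnings.append((hypothesis, validated))
            if validated:
                return hypothesis, learnings  # validated learning: build on it
        return None, learnings  # every test failed: pivot, armed with data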

Don't fear failing. Low-cost failure is good because you learn more from it than you lose. As long as your worst-case scenario is acceptable, risk-taking is good.

The Sovereign Individual

Today, nation-states rule the world. The UN recognizes 195 countries, and they claim almost all land outside of Antarctica.

We take this for granted. Yet for much of the past 2,000 years, large parts of the world's territory were ungoverned.

Why is it different today? Because for most of the past 500 years, technology has created incentives that favor nation-states. The logic of violence favors nation-states, according to James Dale Davidson, author of The Sovereign Individual. Governments have a lot to gain by conquering as much territory as possible, so they do.

It wasn't always so. During the Dark Ages, Europe was fragmented and had few central governments, partly because of armor. With armor, a sword, and a horse, a knight couldn't be stopped. Large states were hard to form because they rely on the threat of violence.

When gunpowder became popular in Europe, the logic of violence changed. In a world with guns, assembling large armies and conquering territory became cheaper.

James Dale Davidson says the internet will make nation-states obsolete. Most of the world's wealth will be online and in people's heads, making capital mobile.

Nation-states rely on predatory taxation of the rich to fund large militaries and welfare programs.

When capital is mobile, people can live anywhere in the world, Davidson says, making predatory taxation impossible. They're no longer bound to a job, land, or factory location; they'll go wherever they're treated best.

Davidson says that over the next century, nation-states will collapse because they won't have enough money to operate as they do now. He imagines a world of small city-states, like Italy before 1900 (or Singapore today).

We've already seen some movement toward a more Sovereign Individual-like world. The pandemic proved large-scale remote work is possible, freeing workers from their location. Many cities and countries offer remote workers incentives to relocate.

Many Western businesspeople live in tax havens, and more people are renouncing their US citizenship due to high taxes. Increasing globalization has led to poor economic conditions and resentment among average people in the West, which is why politicians like Trump and Sanders rose to popularity with angry rhetoric, even though Obama rose to popularity with a more hopeful message.

The Sovereign Individual convinced me, much as Taleb's books did, that the future will be different from the past. Large countries like the U.S. will likely lose influence in the coming decades, while Portugal, Singapore, and Turkey will rise. If the trend toward less freedom continues, people may flee the West en masse.

So a traditional life of college, a big firm job, hard work, and corporate advancement may not be wise. Young people should learn as much as possible and develop flexible skills to adapt to the future.

Sapiens

Sapiens is a history of humanity, from proto-humans in Ethiopia to our internet society today, with some future speculation.

Sapiens views Homo sapiens as a unique species on Earth. We were just animals 100,000 years ago. We're slowly becoming gods, able to affect the climate, travel to every corner of the Earth (and the Moon), build weapons that can kill us all, and wipe out thousands of species.

Sapiens examines what makes Homo sapiens unique. Humans can believe in myths like religion, money, and human-made entities like countries and LLCs.

These myths facilitate large-scale cooperation. Ants can cooperate only with members of their own colony. Any two humans, though, can trade, and even people who aren't genetically related can bond in large groups over religion and nationality.

Combine that with intelligence, and you have a species capable of amazing feats.

Sapiens may make your head explode because it looks at the world without presupposing values, unlike most books. It questions things that aren't usually questioned and says provocative things.

It also shows how human history works. It may help you understand and predict the world. Maybe.

The 4-Hour Workweek

Things can be done better.

Tradition, laziness, bad bosses, or incentive structures cause complacency. If you're willing to make changes and not settle for the status quo, you can do whatever you do better and achieve more in less time.

The 4-Hour Workweek advocates this. Tim Ferriss explains how he made more sales in 2 hours than his 8-hour-a-day colleagues did.

By firing 2 of his most annoying customers and empowering his customer service reps to make more decisions, he was able to leave his business and travel to Europe.

Ferriss shows how to escape your 9-to-5, outsource your life, develop a business that feeds you with little time, and go on mini-retirement adventures abroad.

Don't accept the status quo. Instead, level up. Find a way to improve your results. And try new things.

Why Nations Fail

Nogales, Arizona and Nogales, Mexico were once one town; the US/Mexico border was drawn arbitrarily through it.

Both towns have similar cultures and populations. Nogales, Arizona is well-developed and has a high standard of living. Nogales, Mexico is underdeveloped and has a low standard of living. Whoa!

Why Nations Fail explains how government-created institutions affect a country's development. Strong property rights, capitalism, and non-corrupt governments promote development. Countries without capitalism or strong property rights, or with corrupt governments, don't develop.

Successful countries must also embrace creative destruction. They must offer ordinary citizens a way to improve their lot by creating value for others, rather than reducing them to slaves, serfs, or peasants. The authors note that ordinary people could get rich on trading expeditions in 11th-century Venice.

This is why East and West Germany, and North and South Korea, have such different economies: their citizens face different incentives. It also explains why Chile, China, and Singapore grew so quickly after becoming market economies.

People have spent a lot of money fighting third-world poverty. According to Why Nations Fail, education and infrastructure aren't the answer. Developing nations must adopt free-market economic policies.

Elon Musk

Elon Musk is the world's richest man, but that's not a good way to describe him; it's like calling Steve Jobs a turtleneck-wearer or Benjamin Franklin a printer.

Elon Musk does cool sci-fi stuff to help humanity avoid existential threats.

Oil will run out. We've delayed this by developing better extraction methods. We only have so much nonrenewable oil.

Our society is doomed if it depends on oil. Elon Musk invested heavily in Tesla and SolarCity to speed the shift to renewable energy.

Musk worries about AI: we'll build machines smarter than us, and if something goes wrong we won't be able to stop them, just as cows can't fight humans. Hence Neuralink: we need to become smarter to compete with AI when the time comes.

If Earth becomes uninhabitable, we need a backup plan. An asteroid strike or nuclear war could hit at any moment, and we might have only days to react. We must build a new civilization while times are good and resources are plentiful.

Short-term problems dominate our politics, but long-term issues are more important. Long-term problems can cause mass casualties and homelessness. Musk demonstrates how to think long-term.

The main reason people are impressed by Elon Musk, and why Ashlee Vance's biography influenced me so much, is that he does impossible things.

Electric cars were once considered unprofitable, but Tesla has made them mainstream. SpaceX is the world's largest private space company.

People lack imagination and dismiss ideas they don't understand as impossible. Humanity is about pushing limits. Don't worry if your dreams seem impossible. Try them.

Thanks for reading.

Julie Plavnik

3 years ago

Why the Creator Economy needs a Web3 upgrade

Looking back into the past can help you understand what's happening today and why.

The Creator Economy

"Creator economy" conjures up images of originality, sincerity, and passion. Where do Michelangelos and da Vincis push advancement with their gifts without battling for bread and proving themselves posthumously? 

Creativity has existed as long as humanity, but it has only recently become a new economic paradigm. We even talk about Web3 now.

Let's examine the creative economy's history to better comprehend it. What brought us here? Looking back can help you understand what's happening now.

No yawning, I promise 😉.

Creator Economy's history

The transition to the creator economy was long and uneven. Let's examine the economic and societal changes that led us here.

1. Agriculture to industry

The mid-18th-century Industrial Revolution triggered the shift from agriculture to manufacturing. The industrial economy lasted until World War II.

The industrial economy's principal goal was to provide more affordable, accessible commodities.

Unlike today, products were scarce and inaccessible.

To fulfill its goals, industrialization triggered enormous economic changes, moving power from agrarians to manufacturers. Industrialization brought hard work, rivalry, and new ideas connected to production and automation. Creative thinkers focused on that then.

It doesn't mean music, poetry, or painting had no place back then. They just weren't a top priority. Artists were independent, and the creative field wasn't considered a separate economic sector.

2. The consumer economy

Manufacturers produced more things than consumers desired after World War II. Stuff was no longer scarce.

The economy now had to make customers want to buy what the market offered.

The consumer economic paradigm supplanted the industrial one. Customers (or consumers) replaced producers as the new economic center.

Salesmen, marketing, and journalists also played key roles (TV, radio, newspapers, etc.). Mass media greatly boosted demand for goods, defined trends, and changed views regarding nearly everything.

Mass media also gave rise to pop culture, which focuses on mass-market creative products. Design, printing, publishing, multi-media, audio-visual, cinematographic productions, etc. supported pop culture.

The consumer paradigm generated creative occupations and activities, unlike the industrial economy. Creativity was limited by the need for wide appeal.

Most creators were corporate employees.

Creating a following and making a living from it were difficult.

As Paul Saffo noted, only journalists and TV workers could become widely known. Creators who wished for recognition relied on producers, publishers, and other gatekeepers. Winning their favor was crucial, and luck was the best tactic.

3. The creative economy

The consumer economy was digitized in the 1990s. IT solutions transformed several economic segments, and this new digital economy demanded innovative, digital creativity.

Later, states declared innovation a "valuable asset that creates money and jobs." They also introduced the "creative industries" and the "creative economy" (not creator!) and tasked themselves with supporting them. Australia and the UK were early adopters.

Individual skill, innovation, and intellectual property fueled the creative economy. Its span covered design, writing, audio, video material, etc. The creative economy required IT-powered activity.

The new challenge was to introduce innovations to most economic segments and meet demand for digital products and services.

Despite what the title "creative economy" may imply, it was primarily oriented at meeting consumer needs. It didn't provide inventors any new options to become entrepreneurs. Instead of encouraging innovators to flourish on their own, the creative economy emphasized "employment-based creativity."

4. The creator economy

Next, huge IT platforms like Google, Facebook, YouTube, and others competed with traditional mainstream media.

During the 2008 global financial crisis, these platforms surpassed traditional media. People relied on them for information, knowledge, and networking. That was a digital media revolution, and the creator economy started there.

The new economic paradigm aimed to engage and convert clients. The creator economy allowed customers to engage, interact, and provide value, unlike the consumer economy. It gave them instruments to promote themselves as "products" and make money.

Writers, singers, painters, and other creators have a great way to reach fans. Instead of appeasing old-fashioned gatekeepers (producers, casting managers, publishers, etc.), they can use the platforms to express their talent and gain admirers. Barriers fell.

It's not only for pros. Everyone with a laptop and internet can now create.

2022 creator economy:

Since there is no academic description for the current creator economy, we can freestyle.

The current (or Web2) creator economy is fueled by interactive digital platforms, marketplaces, and tools that allow users to access, produce, and monetize content.

There are no entry hurdles or casting calls in the creator economy: sign up and follow the platform's rules. The trick: the platform's algorithm aggregates your data and tracks you. That is the payment for participation.

The platforms offer content creation, design, and ad distribution options. This is the platforms' main revenue source.

The creator economy opens many avenues for creators to monetize their work. Artists can now earn money through advertising, tipping, brand sponsorship, affiliate links, streaming, and other digital marketing activities.

Even if your content isn't digital, you can utilize platforms to promote it, interact and convert your audience, and more. No limits. However, some of your income always goes to a platform (well, a huge one).

The creator economy aims to empower online entrepreneurship by offering digital marketing tools and reducing impediments.

Barriers remain; they are just different. The next articles will examine them.

Why update the creator economy for Web3?

I could address this question by listing the present creator economy's difficulties that led us to contemplate a Web3 upgrade.

I don't think these difficulties are the main cause, though. A mindset shift made us see these challenges and understand that a better reality without them was possible.

Crypto drove this thinking shift. It promoted disintermediation, independence from third-party service providers, 100% data ownership, and self-sovereignty. Crypto has changed the way we view everyday things.

Crypto's disruptive mission has migrated to other economic segments. It's now called Web3. Web3's creator economy is unique.

Here's the essence of the Web3 economy:

  • Eliminating middlemen between creators and fans.

  • Creators keeping 100% ownership of their data, brand, and effort.

  • Business and money-making transparency.

  • Authentic originality above ad-driven content.

I'll explain in the next several articles. We'll also discuss the creator economy's challenges and Web3's remedies for them.

Final thoughts

The creator economy is the organic developmental stage we've reached after all these social and economic transformations.

The Web3 paradigm of the creator economy intends to allow creators to construct their own independent "open economy" and directly monetize it without a third party.

If this approach succeeds, we may enter a new era of wealth creation in which creators are no longer merely the product. New economies will emerge.



Amelia Winger-Bearskin

3 years ago

Reasons Why AI-Generated Images Remind Me of Nightmares

AI images are like funhouse mirrors.

Google's AI Blog introduced the puppy-slug in the summer of 2015.

Vice / DeepDream

Puppy-slug isn't a single image or character. "Puppy-slug" refers to the unsettling psychedelia produced by Google's DeepDream. The tool is built on convolutional neural networks, models trained to recognize entities in a dataset. If researchers feed the model millions of dog pictures, the network will learn to recognize a dog.

DeepDream used neural networks not only to analyze and classify image data but also to generate its own images. DeepDream's early examples were created by training a convolutional network on dog images and asking it to add "dog-ness" to other images. The models analyzed images to find dog-like pixels and modified surrounding pixels to amplify them.
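A minimal sketch of that feature-amplification loop, assuming PyTorch and torchvision are available (the layer index, step size, and model choice are illustrative, not Google's original configuration):

    # Gradient ascent on a CNN layer's activations: the DeepDream trick.
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    cnn = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()

    def dream(image_path, layer_index=20, steps=20, lr=0.05):
        img = Image.open(image_path).convert("RGB")
        x = T.Compose([T.Resize((224, 224)), T.ToTensor()])(img).unsqueeze(0)
        x.requires_grad_(True)
        for _ in range(steps):
            out = x
            for i, layer in enumerate(cnn):
                out = layer(out)
                if i == layer_index:
                    break
            # Nudge pixels toward whatever this layer already "sees"
            # in them, amplifying the learned features.
            loss = out.norm()
            loss.backward()
            with torch.no_grad():
                x += lr * x.grad / (x.grad.abs().mean() + 1e-8)
                x.grad.zero_()
                x.clamp_(0, 1)
        return T.ToPILImage()(x.squeeze(0).detach())

Run on an ordinary photo, each step exaggerates whatever dog-like patterns the network finds, which is how the puppy-slugs emerge.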

Puppy-slugs and other DeepDream images are ugly. Even when they don't trigger my trypophobia, they give me vertigo when my mind tries to reconcile familiar features and forms in unnatural, physically impossible arrangements. I feel like I've been poisoned by a forbidden mushroom or a noxious toad. I'm a Lovecraft character going mad from extradimensional exposure. They're gross!

Is this really how AIs see the world? That may be an even more unsettling question raised by DeepDream than the blatant abjection of the images themselves.

When these photographs originally circulated online, many friends were startled and scandalized. People imagined a computer's imagination would be literal, accurate, and boring. We didn't expect vivid hallucinations and organic-looking formations.

DeepDream's images didn't really show the machines' imaginations, at least not in the way that scared some people. What DeepDream displays are data visualizations: a peek inside the "black box" of convolutional network training.

Some of these images look scary because the models don't "know" anything, at least not in the way we do.

These images are the result of advanced algorithms comparing and adjusting pixel values. They can spot and reproduce patterns from training data, but they can't interpret it. If they could, they'd know dogs have two eyes and one face per head. If machines can think creatively, they're keeping it quiet.

You could be forgiven for thinking otherwise, given the impressive results of OpenAI's DALL-E. From a technological perspective, it's incredible.

Arthur C. Clarke once said, "Any sufficiently advanced technology is indistinguishable from magic." DALL-E's magic requires a lot of math, computer science, processing power, and research. OpenAI did a great job, and we should applaud them.

Dall-E and similar tools match words and phrases to image data to train generative models. Matching text to images requires sorting and defining the images. Untold millions of low-wage data entry workers, content creators optimizing images for SEO, and anyone who has used a Captcha to access a website make these decisions. These people could live and die without receiving credit for their work, even though the project wouldn't exist without them.

This technique produces images that are less like paintings and more like mirrors that reflect our own beliefs and ideals back at us, albeit via a very complex prism. Due to the limitations and biases that these models portray, we must exercise caution when viewing these images.

The issue was succinctly articulated by artist Mimi Onuoha in her piece "On Algorithmic Violence":

As we continue to see the rise of algorithms being used for civic, social, and cultural decision-making, it becomes that much more important that we name the reality that we are seeing. Not because it is exceptional, but because it is ubiquitous. Not because it creates new inequities, but because it has the power to cloak and amplify existing ones. Not because it is on the horizon, but because it is already here.