Applying Military Lessons to Business

July 27, 2018 | Bob Kelly

The three years I spent in the Army had a profoundly positive impact on my life. I firmly believe serving in the military was one of my greatest honors and should be mandatory for every US citizen. During those three years, we were all trained in the use of various weapons and in survival on the battlefield. At the time, I thought I would never use these skills after I left the service. Well, I was wrong. I found out later that many of these lessons apply in the civilian business world. The central theme of every lesson was the importance of "action." Action is critical on the battlefield and in the business world alike. Here's what I mean:

Stealth and quickness... until you're engaged 

Army: Remain invisible to your enemy. Never reveal your position.

Business: Remain unseen until you have a business model that works and paying customers. Stay in a stealth position and do not "shout from the highest tree." When competing against a larger company, avoid "puffing out your chest" before the time is right. Larger companies can bury smaller ones if you engage too soon and on their terms. On the other hand, smaller businesses have the strength of agility: they can respond to change and make decisions faster.

When in the fight, "bring it ALL"

Army: When you are engaged, the enemy will assess your assets and location as quickly as they can. By surprising them with everything, and from multiple locations, you will disorient them. You will appear more substantial than you are, and this is good.

Business: When you are ready to bring your product or service to the marketplace, do it. Don't roll it out slowly, drip by drip. If you genuinely have a better idea, companies with larger budgets will steal it, incorporate your design, and make it theirs. Instead, launch into the marketplace boldly but not arrogantly.

Quickly shorten the distance to your objective

Army: If you are forced to engage up close (aka hand-to-hand), it's essential to fight the enemy on your terms. Doing this gives you focus and control, and keeps you from merely reacting to every situation.

Business: Identify and focus on achieving specific business objectives. Assign a focus (or responsibility) to each person on your team and have them "own it." Define who will focus on sales, customer support, accounting, etc.

Dominate your objective quickly 

Army: Once you've closed the distance with your enemy, dominate them.

Business: As a business owner, your objective might be to have the "number 1 product or service" in the world. Even if that's a lofty goal, win the closest battles first: start small, then expand your geography.

Hold your ground and don't give it up 

Army: After you've fought to achieve your ground, maintain it! Don't give it up.

Business: Once you've achieved your business objective, or gained a specific customer, stay focused on maintaining that goal. I can guarantee you that your competition is plotting to take away what you've just achieved.

After Achieving Your Objective, Quickly Move Out 

Army: Once you've secured your new area, move on to the next objective. The enemy you've just supplanted will now be planning to regain what they lost. Your enemy now knows your size, your tactics, and what to expect. If you stay where you are, you'll go from being the "hunter" to the "hunted." Bring that same fight somewhere else.

Business: So many times, we see companies become number 1 in their field, grow complacent, and die. Your competition will continuously look for ways to outperform you. This is precisely when leaders need to drive the company toward its next objective.

Read more...

Getting My Big Data Experience Noticed on LinkedIn

The social media service LinkedIn (LI) serves multiple purposes for many people. Some use it to search for a new job, some for professional networking, some for staying current with technical trends, and so on. I think it's safe to say that most people are not passive about how they use this service. Simply put, they use LI as a business intelligence platform. On a daily basis, users take LI's collected data and convert it into useful information.

I am a member of a big data and analytics technical recruiting team. On any given workday we review hundreds of LI profiles. We've seen a lot of good ones, and we've seen a lot of bad ones. So what do the noticeable profiles look like? Let's take a closer look.

In 2017, the latest count from LI put it at about 467 million user profiles. That is an impressive number, but for many users it's an intimidating one, especially if you want to stand out and be noticed. You might be saying to yourself that since you've completed your LI profile, you're done. Right? ...Wrong! Setting yourself apart from the ocean of competing profiles is not easy, nor will it happen automatically. If you've ever wondered why companies with new job opportunities aren't contacting you, chances are there's a problem with your LI profile. Here are a few "do's and don'ts" that might help as you put your profile together.

Add a Professional Headshot Photo To Your Profile

The expression "You only have one chance to make a first impression" holds true. Remember that LinkedIn profiles with professional-looking headshots get 14 times more profile views and are 36 times more likely to receive a LinkedIn message. You don't have to spend a lot of money on professional headshots. Just remember that this is the public's first impression of you as a professional, so make your photo look professional.

Don’t Overlook The Skills Section

I have seen too many people who don't take this section seriously. It is especially important when it comes to looking for a new job. I recommend that all LI users consider the following when completing this section:

  1. What type of professional are you? (Hadoop Admin., Developer, Architect, Manager, Analyst, Team Lead, etc.)

  2. What professional focus are you looking to convey? (IT Strategy, Business Intelligence, Data Science, Leadership, etc.)

  3. What specific skills and technology experience do you have that apply to your profession? (Java, Denodo, Data Integration, Hortonworks, Hazelcast, Algorithms, Bayesian Statistics, etc.)

Now, list those specific skills and technologies in this section or, at a minimum, somewhere in your professional narrative.

Eliminate “Eye-Rollers”

Remove the nebulous adjectives and commodity skills that almost all technologists share (e.g., Team Player, Microsoft Word, Microsoft Excel). Including these "eye-roller" skills shows that you're struggling to fill the section. Think about it: how many technologists do you know who don't know how to use Microsoft Word? Kudos to you, Microsoft!

Knowledge is "Gud"

Nothing will disqualify your professional profile faster than misspellings and/or poor grammar. I am not a college English major, but I can still clearly articulate my professional experience using well-constructed sentences. If you need help in this area, there are free applications (e.g., Grammarly) that can help.

It's very important to know that most (if not all) hiring companies will use LI at some point during their search. When they find spelling and/or grammar errors in a person's profile, it is perceived as a negative. In many cases, they consider these errors careless work and a preview of the quality of work they can expect from that person. Your LI profile (as well as your resume) needs to be as close to perfect as possible.

Matching Dates

More than ever before, companies look for discrepancies when they consider candidates. They will do this by comparing the content and employment dates of a candidate's LI profile with what is stated on their resume. While it may be a simple oversight by the candidate, it can also be viewed as misleading and possibly deceptive behavior. Either way, it will almost certainly raise a cautionary flag.

Some Final Thoughts:

  • Make sure your LI profile stands out for the right reasons.

  • Update your LI profile and resume regularly.

  • Make sure that you list all of the relevant technologies related to your professional life.

Words Are Important

The fastest way to be discovered is to pay close attention to the words you use. Recruiters and hiring companies search for qualified people on LI using "keyword" searches and Boolean logic. You might be the most qualified "Hadoop Developer" on the planet for a newly listed job, but if your LI profile doesn't mention the word "Hadoop," or if you misspell a critical skill that qualifies you for the job, you're virtually invisible.
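To make the keyword point concrete, here is a minimal sketch of the kind of Boolean filter a recruiter's search tool might apply. The profile text, search terms, and the `matches` function are all invented for illustration; the takeaway is that a profile that never states the exact skill word simply never matches.

```python
# Hypothetical recruiter-style Boolean keyword search (illustrative only).
def matches(profile_text, all_of, any_of=()):
    """Require every term in `all_of` (AND); if `any_of` is given,
    require at least one of its terms (OR)."""
    text = profile_text.lower()
    return (all(term.lower() in text for term in all_of)
            and (not any_of or any(term.lower() in text for term in any_of)))

profile_a = "Senior Hadoop Developer with Spark and Hive experience"
profile_b = "Senior big data developer"  # never says the keyword "Hadoop"

print(matches(profile_a, ["hadoop", "spark"]))  # True
print(matches(profile_b, ["hadoop", "spark"]))  # False: invisible to this search
```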

Read more...

Realizing The Potential Of Big Data And Analytics – Forbes

It’s not what you know. It’s what you do with what you know. That’s something companies worldwide will be learning—for better or worse—in the coming year when it comes to big data.

Gurus among us have proclaimed 2017 will be the year big data goes mainstream. If you’re anything like me, you may be wondering if that has already happened and if not, why? Even many teenagers I know use Google Analytics to monitor their daily “brand.” The truth is 2017 marks an even more meaningful shift when it comes to using data in business. For the first time, it will drive business operations, rather than simply reflecting performance. That’s a powerful proposition for those who use analytics effectively. On the flip side: It could be absolutely devastating for companies who are falling behind.

If you’re worried your business may have missed the big data boat, you’re in good company. According to the Harvard Business Review, a majority of today’s businesses are “nowhere close” to recognizing the value analytics can bring. The reasons are all-too-familiar: Lack of vision, lack of communication, lack of an actual plan. The good news: you can do something about it. Below are just a few things to keep in mind as you assemble—or re-assemble—your strategic big data plan.

Find A Champion

Like most big changes in any company, management plays a huge role in how quickly their companies will adopt—and adapt—to it. If you are planning to introduce or enhance your analytics platform, establish supporters and mentors in every sector of the company. After all, data is never central. From your front-line customer service team to your top senior executives, the decisions you make based on results and reporting will reach every single layer of your company—including YOU.

Whether gathering data on the front end or making big decisions in the C Suite, every single person in your organization must buy into the value analytics brings. If not, you run two major risks. First, you could end up with dirty data, which is worthless when it comes to making good, solid business decisions. Second, you could amass tons of amazing data insights that are never utilized by your executive teams. Consider creating a dedicated communication campaign surrounding analytics to ensure full-scale penetration and success.

Daniel Newman is CEO of Broadsuite Media Group, principal analyst at Futurum and author of Building Dragons.

Read more...

Facebook shuts AI system after bots create own language

Days after Tesla CEO Elon Musk said that artificial intelligence (AI) was the biggest risk, Facebook has shut down one of its AI systems after chatbots started speaking in their own language, deviating from the code provided. According to a report in Tech Times on Sunday, the social media giant had to pull the plug on the AI system its researchers were working on "because things got out of hand." "The AI did not start shutting down computers worldwide or something of the sort, but it stopped using English and started using a language that it created," the report noted.

Initially, the AI agents used English to converse with each other, but they later created a new language that only AI systems could understand, thus defying their purpose. This led Facebook researchers to shut down the AI systems and then force them to speak to each other only in English. In June, researchers from the Facebook AI Research Lab (FAIR) found that while they were busy trying to improve chatbots, the "dialogue agents" were creating their own language. Soon, the bots began to deviate from the scripted norms and started communicating in an entirely new language which they created without human input, media reports said.

Using machine learning algorithms, the "dialogue agents" were left to converse freely in an attempt to strengthen their conversational skills. The researchers also found these bots to be "incredibly crafty negotiators." "After learning to negotiate, the bots relied on machine learning and advanced strategies in an attempt to improve the outcome of these negotiations," the report said. "Over time, the bots became quite skilled at it and even began feigning interest in one item in order to 'sacrifice' it at a later stage in the negotiation as a faux compromise," it added.

Although this appears to be a huge leap for AI, several experts, including Professor Stephen Hawking, have raised fears that humans, who are limited by slow biological evolution, could be superseded by AI. Others, like Tesla's Elon Musk, philanthropist Bill Gates, and Apple co-founder Steve Wozniak, have also expressed concerns about where AI technology is heading.

Interestingly, this incident took place just days after a verbal spat between Facebook CEO Mark Zuckerberg and Musk, who exchanged harsh words in a debate on the future of AI. "I've talked to Mark about this (AI). His understanding of the subject is limited," Musk tweeted last week. The tweet came after Zuckerberg, during a Facebook livestream earlier this month, castigated Musk for arguing that care and regulation were needed to safeguard the future if AI becomes mainstream. "I think people who are naysayers and try to drum up these doomsday scenarios -- I just, I don't understand it. It's really negative and in some ways I actually think it is pretty irresponsible," Zuckerberg said.

Musk has been speaking frequently on AI and has called its progress the "biggest risk we face as a civilisation." "AI is a rare case where we need to be proactive in regulation instead of reactive because if we're reactive in AI regulation it's too late," he said.
Read more...

In Blockchain We Trust

To understand why blockchain matters, look past the wild speculation at what is being built underneath, argue the authors of The Age of Cryptocurrency and its newly published follow-up, The Truth Machine: The Blockchain and the Future of Everything.

The dot-com bubble of the 1990s is popularly viewed as a period of crazy excess that ended with hundreds of billions of dollars of wealth being destroyed. What’s less often discussed is how all the cheap capital of the boom years helped fund the infrastructure upon which the most important internet innovations would be built after the bubble burst. It paid for the rollout of fiber-optic cable, R&D in 3G networks, and the buildout of giant server farms. All of this would make possible the technologies that are now the bedrock of the world’s most powerful companies: algorithmic search, social media, mobile computing, cloud services, big-data analytics, AI, and more.

We think something similar is happening behind the wild volatility and stratospheric hype of the cryptocurrency and blockchain boom. The blockchain skeptics have crowed gleefully as crypto-token prices have tumbled from last year’s dizzying highs, but they make the same mistake as the crypto fanboys they mock: they conflate price with inherent value. We can’t yet predict what the blue-chip industries built on blockchain technology will be, but we are confident that they will exist, because the technology itself is all about creating one priceless asset: trust.

To understand why, we need to go back to the 14th century.

That was when Italian merchants and bankers began using the double-entry bookkeeping method. This method, made possible by the adoption of Arabic numerals, gave merchants a more reliable record-keeping tool, and it let bankers assume a powerful new role as middlemen in the international payments system. Yet it wasn’t just the tool itself that made way for modern finance. It was how it was inserted into the culture of the day.

In 1494 Luca Pacioli, a Franciscan friar and mathematician, codified their practices by publishing a manual on math and accounting that presented double-entry bookkeeping not only as a way to track accounts but as a moral obligation. The way Pacioli described it, for everything of value that merchants or bankers took in, they had to give something back. Hence the use of offsetting entries to record separate, balancing values—a debit matched with a credit, an asset with a liability.
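Pacioli's balancing rule can be sketched in a few lines of code. This is a toy illustration, not a real accounting system: each transaction is a set of signed postings that must sum to zero, so every debit is offset by an equal credit, and the books balance by construction. Account names and amounts are invented.

```python
# Toy double-entry ledger: a transaction is a list of (account, amount)
# postings that must net to zero -- a debit matched with a credit.
def post(ledger, postings):
    if sum(amount for _, amount in postings) != 0:
        raise ValueError("unbalanced transaction")
    ledger.extend(postings)

ledger = []
post(ledger, [("cash", +100), ("loans_payable", -100)])  # borrow 100 florins
post(ledger, [("inventory", +40), ("cash", -40)])        # buy goods for 40

# Summing postings per account recovers the balances, and the
# grand total is always zero: assets offset liabilities.
balances = {}
for account, amount in ledger:
    balances[account] = balances.get(account, 0) + amount
print(balances)  # {'cash': 60, 'loans_payable': -100, 'inventory': 40}
```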


Pacioli’s morally upright accounting bestowed a form of religious benediction on these previously disparaged professions. Over the next several centuries, clean books came to be regarded as a sign of honesty and piety, clearing bankers to become payment intermediaries and speeding up the circulation of money. That funded the Renaissance and paved the way for the capitalist explosion that would change the world.

Yet the system was not impervious to fraud. Bankers and other financial actors often breached their moral duty to keep honest books, and they still do—just ask Bernie Madoff’s clients or Enron’s shareholders. Moreover, even when they are honest, their honesty comes at a price. We’ve allowed centralized trust managers such as banks, stock exchanges, and other financial middlemen to become indispensable, and this has turned them from intermediaries into gatekeepers. They charge fees and restrict access, creating friction, curtailing innovation, and strengthening their market dominance.

The real promise of blockchain technology, then, is not that it could make you a billionaire overnight or give you a way to shield your financial activities from nosy governments. It’s that it could drastically reduce the cost of trust by means of a radical, decentralized approach to accounting—and, by extension, create a new way to structure economic organizations.

The need for trust and middlemen allows behemoths such as Google, Facebook, and Amazon to turn economies of scale and network effects into de facto monopolies.

A new form of bookkeeping might seem like a dull accomplishment. Yet for thousands of years, going back to Hammurabi’s Babylon, ledgers have been the bedrock of civilization. That’s because the exchanges of value on which society is founded require us to trust each other’s claims about what we own, what we’re owed, and what we owe. To achieve that trust, we need a common system for keeping track of our transactions, a system that gives definition and order to society itself. How else would we know that Jeff Bezos is the world’s richest human being, that the GDP of Argentina is $620 billion, that 71 percent of the world’s population lives on less than $10 a day, or that Apple’s shares are trading at a particular multiple of the company’s earnings per share?

A blockchain (though the term is bandied about loosely, and often misapplied to things that are not really blockchains) is an electronic ledger—a list of transactions. Those transactions can in principle represent almost anything. They could be actual exchanges of money, as they are on the blockchains that underlie cryptocurrencies like Bitcoin. They could mark exchanges of other assets, such as digital stock certificates. They could represent instructions, such as orders to buy or sell a stock. They could include so-called smart contracts, which are computerized instructions to do something (e.g., buy a stock) if something else is true (the price of the stock has dropped below $10).
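The stock-order example above can be sketched as a plain function. A real smart contract is code deployed to and executed by a blockchain network; this hypothetical snippet only shows the "do something if something else is true" shape of the instruction.

```python
# Illustrative sketch of a smart-contract-style instruction:
# buy a stock once its price drops below $10. Names are invented.
def smart_order(price, threshold=10.0):
    """Return a buy action if the trigger condition holds, else None."""
    if price < threshold:
        return {"action": "buy", "price": price}
    return None

print(smart_order(12.50))  # None: condition not met, nothing happens
print(smart_order(9.75))   # {'action': 'buy', 'price': 9.75}
```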

What makes a blockchain a special kind of ledger is that instead of being managed by a single centralized institution, such as a bank or government agency, it is stored in multiple copies on multiple independent computers within a decentralized network. No single entity controls the ledger. Any of the computers on the network can make a change to the ledger, but only by following rules dictated by a “consensus protocol,” a mathematical algorithm that requires a majority of the other computers on the network to agree with the change.

Once a consensus generated by that algorithm has been achieved, all the computers on the network update their copies of the ledger simultaneously. If any of them tries to add an entry to the ledger without this consensus, or to change an entry retroactively, the rest of the network automatically rejects the entry as invalid.

Typically, transactions are bundled together into blocks of a certain size that are chained together (hence “blockchain”) by cryptographic locks, themselves a product of the consensus algorithm. This produces an immutable, shared record of the “truth,” one that—if things have been set up right—cannot be tampered with.
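A minimal sketch of that chaining idea, with invented transactions: each block records the hash of its predecessor, so retroactively changing any earlier entry breaks every later link and the chain no longer validates. (Real blockchains add consensus protocols, signatures, and much more on top of this.)

```python
# Blocks "chained by cryptographic locks": each block stores a SHA-256
# hash of the previous block, making retroactive edits detectable.
import hashlib
import json

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def is_valid(chain):
    # Every block must point at the hash of the block before it.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, ["alice pays bob 5"])
add_block(chain, ["bob pays carol 2"])
print(is_valid(chain))  # True

chain[0]["transactions"] = ["alice pays bob 500"]  # retroactive tampering
print(is_valid(chain))  # False: the altered block breaks the chain
```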

Within this general framework are many variations. There are different kinds of consensus protocols, for example, and often disagreements over which kind is most secure. There are public, “permissionless” blockchain ledgers, to which in principle anyone can hitch a computer and become part of the network; these are what Bitcoin and most other cryptocurrencies belong to. There are also private, “permissioned” ledger systems that incorporate no digital currency. These might be used by a group of organizations that need a common record-keeping system but are independent of one another and perhaps don’t entirely trust one another—a manufacturer and its suppliers, for example.

The common thread between all of them is that mathematical rules and impregnable cryptography, rather than trust in fallible humans or institutions, are what guarantee the integrity of the ledger. It’s a version of what the cryptographer Ian Grigg described as “triple-entry bookkeeping”: one entry on the debit side, another for the credit, and a third into an immutable, undisputed, shared ledger.

The benefits of this decentralized model emerge when weighed against the current economic system’s cost of trust. Consider this: In 2007, Lehman Brothers reported record profits and revenue, all endorsed by its auditor, Ernst & Young. Nine months later, a nosedive in those same assets rendered the 158-year-old business bankrupt, triggering the biggest financial crisis in 80 years. Clearly, the valuations cited in the preceding years’ books were way off. And we later learned that Lehman’s ledger wasn’t the only one with dubious data. Banks in the US and Europe paid out hundreds of billions of dollars in fines and settlements to cover losses caused by inflated balance sheets. It was a powerful reminder of the high price we often pay for trusting centralized entities’ internally devised numbers.


The crisis was an extreme example of the cost of trust. But we also find that cost ingrained in most other areas of the economy. Think of all the accountants whose cubicles fill the skyscrapers of the world. Their jobs, reconciling their company’s ledgers with those of its business counterparts, exist because neither party trusts the other’s record. It is a time-consuming, expensive, yet necessary process.

Other manifestations of the cost of trust are felt not in what we do but in what we can’t do. Two billion people are denied bank accounts, which locks them out of the global economy because banks don’t trust the records of their assets and identities. Meanwhile, the internet of things, which it’s hoped will have billions of interacting autonomous devices forging new efficiencies, won’t be possible if gadget-to-gadget microtransactions require the prohibitively expensive intermediation of centrally controlled ledgers. There are many other examples of how this problem limits innovation.

These costs are rarely acknowledged or analyzed by the economics profession, perhaps because practices such as account reconciliation are assumed to be an integral, unavoidable feature of business (much as pre-internet businesses assumed they had no option but to pay large postal expenses to mail out monthly bills). Might this blind spot explain why some prominent economists are quick to dismiss blockchain technology? Many say they can’t see the justification for its costs. Yet their analyses typically don’t weigh those costs against the far-reaching societal cost of trust that the new models seek to overcome.

More and more people get it, however. Since Bitcoin’s low-key release in January 2009, the ranks of its advocates have swelled from libertarian-minded radicals to include former Wall Street professionals, Silicon Valley tech mavens, and development and aid experts from bodies such as the World Bank. Many see the technology’s rise as a vital new phase in the internet economy—one that is, arguably, even more transformative than the first. Whereas the first wave of online disruption saw brick-and-mortar businesses displaced by leaner digital intermediaries, this movement challenges the whole idea of for-profit middlemen altogether.

The need for trust, the cost of it, and the dependence on middlemen to provide it is one reason why behemoths such as Google, Facebook, and Amazon turn economies of scale and network-effect advantages into de facto monopolies. These giants are, in effect, centralized ledger keepers, building vast records of “transactions” in what is, arguably, the most important “currency” in the world: our digital data. In controlling those records, they control us.

The potential promise of overturning this entrenched, centralized system is an important factor behind the gold-rush-like scene in the crypto-token market, with its soaring yet volatile prices. No doubt many—perhaps most—investors are merely hoping to get rich quick and give little thought to why the technology matters. But manias like this, as irrational as they become, don’t spring out of nowhere. As with the arrival of past transformative platform technologies—railroads, for example, or electricity—rampant speculation is almost inevitable. That’s because when a big new idea comes along, investors have no framework for estimating how much value it will create or destroy, or for deciding which enterprises will win or lose.

Although there are still major obstacles to overcome before blockchains can fulfill the promise of a more robust system for recording and storing objective truth, these concepts are already being tested in the field.

Freely accessible open-source code is the foundation upon which the decentralized economy of the future will be built.

Companies such as IBM and Foxconn are exploiting the idea of immutability in projects that seek to unlock trade finance and make supply chains more transparent. Such transparency could also give consumers better information on the sources of what they buy—whether a T-shirt was made with sweatshop labor, for example.

Another important new idea is that of a digital asset. Before Bitcoin, nobody could own an asset in the digital realm. Since copying digital content is easy to do and difficult to stop, providers of digital products such as MP3 audio files or e-books never give customers outright ownership of the content, but instead lease it and define what users can do with it in a license, with stiff legal penalties if the license is broken. This is why you can make a 14-day loan of your Amazon Kindle book to a friend, but you can’t sell it or give it as a gift, as you might a paper book.

Bitcoin showed that an item of value could be both digital and verifiably unique. Since nobody can alter the ledger and “double-spend,” or duplicate, a bitcoin, it can be conceived of as a unique “thing” or asset. That means we can now represent any form of value—a property title or a music track, for example—as an entry in a blockchain transaction. And by digitizing different forms of value in this way, we can introduce software for managing the economy that operates around them.

As software-based items, these new digital assets can be given certain “If X, then Y” properties. In other words, money can become programmable. For example, you could pay to hire an electric vehicle using digital tokens that also serve to activate or disable its engine, thus fulfilling the encoded terms of a smart contract. It’s quite different from analog tokens such as banknotes or metal coins, which are agnostic about what they’re used for.
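The electric-vehicle example can be sketched as follows. The class and its fields are invented for illustration; the point is only that the token carries its own "If X, then Y" rule, unlike an agnostic banknote.

```python
# Hypothetical "programmable money" token: payment and the rule that
# enables the vehicle's engine live in the same object.
class RideToken:
    def __init__(self, paid_minutes):
        self.paid_minutes = paid_minutes  # what the rider paid for
        self.used_minutes = 0             # what has been consumed

    def engine_enabled(self):
        # The "If X, then Y" term encoded in the token itself:
        # the engine runs only while paid time remains.
        return self.used_minutes < self.paid_minutes

    def tick(self, minutes):
        self.used_minutes += minutes

token = RideToken(paid_minutes=30)
print(token.engine_enabled())  # True: paid time remains, engine may run
token.tick(30)
print(token.engine_enabled())  # False: contract terms exhausted
```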

What makes these programmable money contracts “smart” is not that they’re automated; we already have that when our bank follows our programmed instructions to autopay our credit card bill every month. It’s that the computers executing the contract are monitored by a decentralized blockchain network. That assures all signatories to a smart contract that it will be carried out fairly.

With this technology, the computers of a shipper and an exporter, for example, could automate a transfer of ownership of goods once the decentralized software they both use sends a signal that a digital-currency payment—or a cryptographically unbreakable commitment to pay—has been made. Neither party necessarily trusts the other, but they can nonetheless carry out that automatic transfer without relying on a third party. In this way, smart contracts take automation to a new level—enabling a much more open, global set of relationships.


Programmable money and smart contracts constitute a powerful way for communities to govern themselves in pursuit of common objectives. They even offer a potential breakthrough in the “Tragedy of the Commons,” the long-held notion that people can’t simultaneously serve their self-interest and the common good. That was evident in many of the blockchain proposals from the 100 software engineers who took part in Hack4Climate at last year’s UN climate-change conference in Bonn. The winning team, with a project called GainForest, is now developing a blockchain-based system by which donors can reward communities living in vulnerable rain forests for provable actions they take to restore the environment.

Still, this utopian, frictionless “token economy” is far from reality. Regulators in China, South Korea, and the US have cracked down on issuers and traders of tokens, viewing such currencies more as speculative get-rich-quick schemes that avoid securities laws than as world-changing new economic models. They’re not entirely wrong: some developers have pre-sold tokens in “initial coin offerings,” or ICOs, but haven’t used the money to build and market products. Public or “permissionless” blockchains like Bitcoin and Ethereum, which hold the greatest promise of absolute openness and immutability, are facing growing pains. Bitcoin still can’t process more than seven transactions a second, and transaction fees can sometimes spike, making it costly to use.

Meanwhile, the centralized institutions that should be vulnerable to disruption, such as banks, are digging in. They are protected by existing regulations, which are ostensibly imposed to keep them honest but inadvertently constitute a compliance cost for startups. Those regulations, such as the burdensome reporting and capital requirements that the New York State Department of Financial Services’ “BitLicense” imposed on cryptocurrency remittance startups, become barriers to entry that protect incumbents.

But here’s the thing: the open-source nature of blockchain technology, the excitement it has generated, and the rising value of the underlying tokens have encouraged a global pool of intelligent, impassioned, and financially motivated computer scientists to work on overcoming these limitations. It’s reasonable to assume they will constantly improve the tech. Just as we’ve seen with internet software, open, extensible protocols such as these can become powerful platforms for innovation. Blockchain technology is moving way too fast for us to think later versions won’t improve upon the present, whether it’s in Bitcoin’s cryptocurrency-based protocol, Ethereum’s smart-contract-focused blockchain, or some as-yet-undiscovered platform.

The crypto bubble, like the dot-com bubble, is creating the infrastructure that will enable the technologies of the future to be built. But there’s also a key difference. This time, the money being raised isn’t underwriting physical infrastructure but social infrastructure. It’s creating incentives to form global networks of collaborating developers, hive minds whose supply of interacting, iterative ideas is codified into lines of open-source software. That freely accessible code will enable the execution of countless as-yet-unimagined ideas. It is the foundation upon which the decentralized economy of the future will be built.

Just as few people in the mid-1990s could predict the later emergence of Google, Facebook, and Uber, we can’t predict what blockchain-based applications will emerge from the wreckage of this bubble to dominate the decentralized future. But that’s what you get with extensible platforms. Whether it’s the open protocols of the internet or the blockchain’s core components of algorithmic consensus and distributed record-keeping, their power lies in providing an entirely new paradigm for innovators ready to dream up and deploy world-changing applications. In this case, those applications—whatever shape they take—will be aimed squarely at disrupting many of the gatekeeping institutions that currently dominate our centralized economy.

Read more...