Artificial Intelligence or Natural Stupidity

My first foray into artificial intelligence was in the early nineties, during my internship at one of the national labs. Computers were still new to me, and my first and foremost goal was to learn programming. When I was asked to do a bunch of numerical methods and mathematical analysis on a large mainframe sitting in Arctic-like conditions, it was a bummer to begin with. Not until the second week did I get my first taste of doing something that completely blew my mind and changed my outlook on what I should be doing going forward. Computers have a different way of solving continuous functions, which I came to appreciate. But then there was also pattern recognition: applying advanced techniques to approximate polygons and produce a much more compressed data set. Who knew storage and connectivity costs would plummet. This required serious thought, but after 3 months I was able to produce something my adviser put to good use, including a paper in his own right. I continued reading – well, that's all I could do on my own – fuzzy logic, neural networks – indeed, it was the buzz of the 90s. But then it all faded away … until recently.

While reading Michael Lewis' fascinating book, "The Undoing Project," I came across a rather striking and humorous anecdote about Amos Tversky. When once asked what he was doing about artificial intelligence, he responded that he spent more time thinking about people showing natural stupidity. This is a gem that can't be left unused. I have to admit that after spending almost 20 years in real life, and more recently in the world of business management, dealing with natural stupidity presents so many opportunities – so much low-hanging fruit – that the use of artificial intelligence still seems rather far off. It is somewhat like reaching for an elephant gun, if there is any such thing, to fight a swarm of mosquitoes.

But there might be something else at play. It may be that artificial intelligence is quite effective precisely at overcoming natural stupidity. For the last few years we have been hearing buzzwords like "big data," and that analytics will change everything for good – not the least of which is making lots of money for everyone in the business – if only we knew what to do with it. Well, artificial intelligence may have an answer. It might be able to draw associations and guide us stupid humans toward how the data may be useful. In particular, I am intrigued by the possibilities of unsupervised learning and techniques like Principal Component Analysis. Some parenting will be needed to ensure that the learning is useful, but other than that, not much. I am excited, and the spirits of my undergrad days are rekindled.
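For the curious, the core of Principal Component Analysis is surprisingly little code. Here is a minimal sketch in Python using only NumPy and entirely synthetic data (not any real data set), showing how the technique can expose a hidden pattern without any labels or supervision:

```python
# Illustrative sketch of PCA from scratch with NumPy.
# All data here is synthetic, generated just for the demo.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic dataset: 200 samples, 3 features, where two features
# are strongly correlated (a hidden one-dimensional pattern).
t = rng.normal(size=200)
X = np.column_stack([t,
                     2 * t + 0.1 * rng.normal(size=200),
                     0.5 * rng.normal(size=200)])

# Center the data, then take eigenvectors of the covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort components by explained variance, largest first.
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
print("Variance explained by each component:", explained.round(3))
# The first component captures most of the variance, recovering
# the hidden correlation between the first two features.
```

The "parenting" I mention above shows up here as the choices a human still makes: how to scale the features, how many components to keep, and what the components mean.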

So it looks like the logical path forward to the promised land of untapping this value may be a little more Rousseau than Voltaire. What do we need? Give people the tools to play with the data. Give them the time to play and learn – the way I see my 3-year-old learn these days. Make, break, tinker, and you will have something useful. All it takes are the tools and an open mind. But in the current business environment – hardwired to deadlines, meetings, and meeting targets – those are exactly the attributes in short supply. So I wonder how this will move forward. Will the industry make such huge bets unless we change how we innovate and expand, extend, and enhance our ways? That has to be seen. Will our natural stupidity let us change our behaviors, or hinder us from it? That, too, remains to be seen. I will keep pondering it.

Offshore wind – has the gold rush begun?

Fortune magazine has an interesting article on offshore wind in the US. Despite initial hiccups and challenges, there is the sweet taste of success. Here is the link to the story. The typical opponents complain about the view and the impact on endangered flora and fauna. Flora and fauna – in this case fish and whales – are one thing; visual pollution is another. The same could have been said of the sky before the airlines, when it was perhaps bluer and less smoggy. But then life has to move on.


Here is what the article says:

“For the first time, U.S. investors see a path to profitability. The gold rush has begun. In the U.S., 23 offshore wind projects totaling 16 gigawatts (GW), the equivalent of about 16 nuclear power plants, are on the drawing board. Almost all are located along the northeast coast. Over the past year, Denmark’s oil and gas giant Dong Energy bought federal leases off the coasts of Massachusetts and New Jersey. Norway’s Statoil won a 33-round auction to secure a 79,000-acre site south of Jones Beach on Long Island for $42.5 million, far more than the $16 million generated by all earlier offshore wind auctions combined. Shell has been sniffing around. Wall Street players such as Citigroup (C, -0.15%), HSBC (HSBC, -1.00%), and, as we’ll see, D.E. Shaw are lining up to finance the most promising projects.”

Looks like the good days might just have arrived.


De-prioritizing and De-regulating Reliability

Will reliability become less important in the future for electric utilities? A recent IEA report, "Re-powering Markets," raises the question: why regulate reliability? As we have seen in other asset-heavy industries such as airlines or telecom, following deregulation, with robust price signals, markets can work reasonably well to balance demand and supply. Indeed there are some issues, and many argue that customer service levels have gone down, but by and large the market has functioned well, and reliability is no longer a regulated attribute. Safety, yes, but not reliability or service levels.

Utilities have historically treated reliability as one of their most important requirements, alongside safe and affordable electricity. Three principal components underpin reliable grid operations: fuel security, system balancing, and resource adequacy. As long as investments fall into one of these categories, there is a rationale for regulators to fund them and for utilities to include them in their rate base.

Part of the reason reliability is the dominant measure is that there is no other measure or technology to dynamically know what is important to the customer. Barring large industrials, who can communicate back to the utilities, the gross assumption is that demand for electricity (or gas or water) is inelastic, so reliable supply is necessary to avoid rolling blackouts. Utilities, therefore, are on the hook to supply reliable power in the territory where they hold natural monopoly rights.

Digitization may change this paradigm. Reliability may be unbundled such that customers can choose plans offering varying degrees of reliability based on utility, needs, and willingness to pay. In other words, if there is a way to balance the grid at all times, then with reliable price signals, demand and supply can be managed without a mandate for reliability. Technologies like micro-grids and other localized solutions can also play a big role in disaggregating reliable power. Controls at the customer level, such as smart meter controls, can be used for the same purpose. This may lead to a fundamental shift in the industry. How soon will all this emerge? It depends on how quickly technology can diffuse and enable these changes. For the near future, however, not much is changing. Reliability will continue to be one of the important drivers. But the fact that the question is being asked more and more means we should no longer take it for granted.

Use of Thought Experiments

What is a thought experiment?

The term "thought experiment" gained its significance in connection with the genius of the last century, Albert Einstein. Recently, with the empirical evidence supporting gravitational waves, there has been much wonder at how Einstein, over a century ago, could use thought experiments to devise the theoretical basis for gravitation.

“A thought experiment is a device with which one performs an intentional, structured process of intellectual deliberation in order to speculate, within a specifiable problem domain, about potential consequents (or antecedents) for a designated antecedent (or consequent)” (Yeates, 2004, p. 150).

For mere mortals like me, it is dreaming. It is how Einstein conceptualized a problem and then solved it. In the domain of theoretical work there are not many options for setting up experiments to prove things empirically. So how is this different from inductive reasoning? Maybe it is not. It does involve coming up with a guess and then, using known methods, theorems, theories, and facts, trying to prove or disprove the guess. Experts have written volumes on what a thought experiment is and its philosophical basis. As far as my mind goes, it is simply asking "what if."

A related question is: what do you have to believe {y} to be for {x} to be true? This is a more powerful way to frame the inquiry than asking what {x} will be. It is rooted in the distinction between ends and means. A rational, scientific inquiry cannot deal with what the end should be in isolation. For example, what should the company do? What should its goals be? That is purely driven by one's aspirations. Aspirations and goals are based on one's values, and there lies the rub. Scientific inquiry cannot address one's values; science is limited here. What science can do is logically show which means are workable and which are not. So framing a question as what one would have to assume for a given end makes it a manageable examination. The next time the question is whether we should double our revenue in 2 years, the way to deal with it is not to argue about whether we can or cannot, but to shift it to: what do we have to believe has to happen to double revenues in 2 years?

Rebranding Electricity

The utility industry has been virtually stagnant when it comes to introducing new products. Many say electricity is the ultimate commodity – true. One cannot distinguish whether an electron was produced from a clean source like solar, a dirty source like a super-polluting coal boiler, an internal combustion engine, or a clean but potentially risky nuclear power plant. So for many, the end product does not lend itself to much variation for companies to play and win on differentiation and uniqueness. As such, the competitive forces have converged on the economics of production. Metrics such as the levelized cost of electricity have been introduced to normalize comparison across the various technologies – to get to an apples-to-apples comparison – but they have limited use. Consumers don't care much about the difference (or very few do) and are more concerned about the reliability, safety, and affordability of electricity. But does this mean there is no possibility for differentiation? If water can be sold in bottles that compete with the faucet, can electricity do the same?
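Since the levelized cost of electricity comes up above, it is worth making the metric concrete: LCOE is simply discounted lifetime costs divided by discounted lifetime generation. A minimal sketch in Python, with purely illustrative numbers that do not describe any real plant:

```python
# Illustrative LCOE calculation. All figures are made-up placeholders.
def lcoe(capex, annual_opex, annual_mwh, years, discount_rate):
    """Levelized cost: discounted lifetime costs / discounted lifetime MWh."""
    costs = capex + sum(annual_opex / (1 + discount_rate) ** y
                        for y in range(1, years + 1))
    energy = sum(annual_mwh / (1 + discount_rate) ** y
                 for y in range(1, years + 1))
    return costs / energy  # $/MWh

# Hypothetical plant: $1.5M upfront, $30k/yr O&M,
# 4,000 MWh/yr output, 25-year life, 7% discount rate.
cost_per_mwh = lcoe(1_500_000, 30_000, 4_000, 25, 0.07)
print(f"LCOE: ${cost_per_mwh:.2f}/MWh")
```

The limitation I note above is visible right in the formula: LCOE compresses everything into a single $/MWh number and says nothing about when the power is available or how reliable it is.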

Perhaps we need to start looking at other commodities to get some insights into consumer behavior. Take, for example, chicken or beef. Granted, there are differences between chicken and electricity, but both are commodities. When we go to a grocery store there are many kinds of chicken and beef. Free-range chicken and hormone-free beef typically cost more. This provides some interesting cues to the way consumers care about what they buy, and it allows companies to target segments aptly. Certain customers care about how the product is made, not just the end product in itself. The how lends itself to the larger health issue.

Can this be relevant to electricity? Electricity that comes from clean sources may be better for you than electricity that comes from dirty sources – not because of the electron per se but because of how it was made. However, there is one fundamental difference between food and electricity production. Food directly affects our bodies, and what I eat has a more direct connection to my immediate well-being than electricity. At least that's what I am led to believe. Same with water. With the recent crisis in Flint, Michigan, such scenarios are quite clear even in a developed country. Water may have another dimension: consumers may trust a big company for quality related to health rather than expect every water fountain to guarantee safety and quality. The environment may be a close parallel to electricity. We can hold someone responsible for the quality of the air we breathe, and conversely we can seek a premium for better-quality air. But we are not there yet. Generational impacts are long-horizon impacts that hit our psyche slowly. For some they hit only after an adverse event. For example, I find it a lot harder to resist food that is a slow-acting time bomb on the body – foods that affect my cholesterol and heart – compared to something that can cause immediate salmonella.

What does this tell us?

It tells us that consumer behavior, like human behavior generally, has something to do with traits we have never grown out of. Remember the famous Stanford marshmallow test? Unless we can adopt an adult posture that puts aside instant gratification for future joy, there is not much hope. All the concern for global warming, and the willingness to pay a price today for future benefit, have a timeframe that sits outside our horizon of concern. But that's not the case for our future generations. And there may be immediate impacts that we are not smart enough to measure or notice. That would be the starting point for creating something unique in branding electricity. Just as we did with smoking – by creating social awareness and banning smoking in public places – using clean energy needs to be framed from a public health standpoint. It is going to be a long ride, but an important one.


Disruption – Over-used? What about “stack fallacy”?

Clay Christensen is the person behind making "disruption" a common term in business. Lately he has been critical of how the term has been used for so many business changes, and of how everyone has been loosening its tie to the fundamental theory he proposed several years ago. In the December 2015 HBR issue, he elaborates and reminds everyone in this article. Yet when I read this article on the stack fallacy in the WSJ, with its headline using disruption, I wonder whether the title "Why Big Companies Keep Getting Disrupted" is appropriate. Indeed, the article even has Christensen's picture, leading the reader to believe that the WSJ author is paying an implied salute to his theory. Not really. This article depicts disruption as something more commonly observed with companies who drift down the stack. Companies that are built on top of underlying abstractions are better off in extending their product and offering boundaries. For example, Salesforce is a more successful CRM player than Oracle because it understands the user better, being higher in the application stack built on top of Oracle. Oracle, in contrast, is a database provider, and the user base of a CRM is not the same as the user base of databases. The stack fallacy says it is easier for Salesforce to get into a successful database business than for Oracle to get into a successful CRM business. Another advantage of flowing down the stack toward vertical integration: given that Salesforce uses Oracle databases, it is a customer itself, so it already knows the key customer needs. Not so for Oracle; hence it is much harder for Oracle to be in an application business such as CRM. The facts prove this thesis. This is an astute observation, and the article on the "stack fallacy" is quite an interesting read, so I highly recommend it. By this token, Tesla is more likely to be successful in making batteries than Panasonic in making cars.
(Aside: while going through the comments, I also came across another read on what specialization does.)

Going back to the central point: I thought disruption meant coming from the low end of the market and gradually seeping into the target customers of the incumbents, to the point where customers switch en masse. This is not because incumbents are stupid or negligent but because they don't believe they are in the business the emerging companies are good at. They also believe the emerging companies are not good enough for the incumbents' customer base. That happened with PCs and mainframes, digital cameras and film cameras, and then phone cameras and digital cameras. But this is not the same as the "stack fallacy" discussed in the WSJ article. Many would even argue that the theory of disruption itself has limited practical use. It is a beautiful theory for explaining the past, but what about its predictive ability? How do companies know that they will get disrupted? That exercise falls back on the basics of strategic analysis and, indeed, on what economics has dealt with for a long time in demand estimation: the role of substitutes. If there is a viable substitute, then losing market share is inevitable for the incumbent. Hence, if the word "disruption" is only meant for a specific trajectory – the path taken rather than the state itself – then the theory of disruption ends up being more of an academic distinction. If we focus instead on the state of customer switching and the drivers of losing one's business, then the topic can be subsumed within Porter's long-standing five forces framework on competitive strategy. Why do we need another theory?

I lean more toward the latter. Just a few principles should be enough to explain most things. Markets and customers are in flux. Their preferences and choices are dynamic and shift all the time, shaped by all sorts of things. Knowing them and serving them right is the key challenge every business has, and those that do it better win. The stack fallacy is an elegant explanation, but I would be careful how far I take it to explain everything.

Germany’s electricity situation

From the vantage point of the American utility industry, the German electricity sector appears to be in turmoil. The energy transition known as the Energiewende brought about a series of changes – some intended and some not. The impact of these changes is quite widespread, raising debates and questions on both sides of the Atlantic. Renewables proponents showcase Germany as the poster child of what policy should be. Traditional utilities counter that the right policy is exactly the opposite of what Germany did. Indeed, many believe it is a case of good intentions gone bad. Here are the key issues:

First, the phasing out of nuclear power by 2022. Germany currently has 9 nuclear reactors that generate about 15% of its total electricity. After the Fukushima disaster, Germany created a mandate to phase out all nuclear generation by 2022. Is this the right move? Is it an overreaction? Without securing backup generation, it is certainly turning out to be a hasty commitment that has become a political hot potato – too hot to touch anymore.

Second, the introduction and then proliferation of feed-in tariffs (FiT). Many countries have granted subsidies in various forms – FiTs, tax credits, government grants, etc. – to support the growth of solar PV. The way the German FiT policy works, transmission companies are required to buy solar PV power at wholesale rates and then sell it to retail customers. Since solar PV power is more expensive, the PV owner is made whole by a feed-in tariff that is ultimately paid for by the retail customers. Thankfully, industrial customers are exempted, which has kept energy prices for German industry competitive with the rest of the world. Has Germany gone too far? Does this create market distortions that ultimately result in inefficiency? Will solar PV become economical going forward?
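The FiT mechanics described above can be sketched with a toy calculation: retail customers cover the gap between the guaranteed rate and the wholesale price. All numbers below are hypothetical placeholders, not actual German tariff figures; the point is only the structure of the surcharge:

```python
# Toy illustration of a feed-in-tariff surcharge.
# All numbers are hypothetical, chosen only to show the mechanics.
fit_rate = 120.0        # $/MWh guaranteed to the PV owner (assumed)
wholesale_price = 35.0  # $/MWh market price (assumed)
pv_generation_twh = 40.0        # subsidized PV output (assumed)
retail_consumption_twh = 400.0  # consumption bearing the surcharge
                                # (industrials exempted, as in Germany)

# Total subsidy = gap between guaranteed rate and market value.
# $/MWh * TWh = $/MWh * 1e6 MWh, so the product is in millions of $.
subsidy_musd = (fit_rate - wholesale_price) * pv_generation_twh

# Spread over the non-exempt retail base as a per-MWh surcharge.
surcharge_per_mwh = subsidy_musd * 1e6 / (retail_consumption_twh * 1e6)
print(f"Subsidy: ${subsidy_musd:,.0f}M, surcharge: ${surcharge_per_mwh:.2f}/MWh")
```

Note how the structure itself answers part of the distortion question: the smaller the non-exempt retail base, the heavier the surcharge each remaining customer carries.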

Third, unprecedented growth in solar PV, driven by plummeting prices and favorable policies. Prices have fallen by 75% since 2006. With the FiT and a tremendous government push, renewables are growing. Solar PV already supplies about 6% of total energy consumption, which may not look high but is quite high by global standards. Is this sustainable?

The impacts so far have been quite pronounced, with noteworthy implications for the future of the electricity sector.

First, the positive. The big governmental push has helped solar PV penetration and its ride down the cost curve. Solar PV has become economically competitive and a viable alternative in certain parts of the world, with certain limitations. On the other hand, German utilities are certainly bearing the biggest brunt. The so-called big four – E.On, RWE, Vattenfall, and EnBW – are all struggling due to reduced production and falling wholesale prices. Meanwhile, customers face high retail prices despite low wholesale prices, due to the recovery of FiT subsidies. The FiT's initial cost estimate was $3-4B, but with the rapid rise in deployments that number has escalated to about $10B annually and is likely to remain there until 2020. Another issue is the slowdown in the carbon emissions market, which has resulted in growth in lignite-based generation to cover the baseload reduction caused largely by the nuclear phase-out. The low cost of carbon emissions (~$10/ton) does not curb polluting behavior, which requires at least $30/ton by typical estimates. Lignite-based generation – by far one of the dirtiest forms of coal for atmospheric emissions – will likely stay unless the carbon markets become material again. Indeed, lignite generation is seeing an uptick not just in Germany but also in neighboring countries, from Poland to the Czech Republic and Bulgaria. All have plans to build lignite plants, absent higher carbon pricing.
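The carbon-price point can be made concrete with back-of-the-envelope arithmetic. Assuming a rough emissions intensity for lignite of about 1.1 tCO2 per MWh (an approximate figure I am using for illustration, not an official number), the allowance price translates almost directly into a per-MWh cost adder:

```python
# Back-of-the-envelope carbon cost adder for lignite generation.
# The emissions intensity is a rough, assumed figure for illustration.
lignite_intensity = 1.1  # tCO2 per MWh, approximate

for carbon_price in (10, 30):  # $/tCO2: current level vs. cited threshold
    adder = carbon_price * lignite_intensity
    print(f"At ${carbon_price}/ton, carbon adds about ${adder:.0f}/MWh to lignite")
```

At roughly $11/MWh versus $33/MWh, it is easy to see why the lower allowance price barely dents lignite's dispatch economics while the higher one could.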

Is it misplaced policy or a series of masterful political maneuvers to keep constituents happy? Lots of theories are floating around: promoting renewables while quietly passing the costs on to customers; sounding the death knell of the nuclear industry while pandering to public fear and outcry after the Fukushima disaster. The issues may sound simple, but they are quite complex. Time will tell how the transition finally gets implemented and received, and we all get to witness the full spectrum of intended and unintended consequences.

Argy-bargy over disruption

Disruption is a word that is getting a lot of attention lately. First, as an elegant theory, it explained why well-run companies failed. Then, regardless of how industries and companies failed, people started using it indiscriminately. Now the original coiner is alarmed. This month, Clay Christensen has written an article in HBR re-clarifying what disruption means and its rightful use. Bhaskar Chakravorti, an expert in innovation at Tufts, explains in an article how rampant the misuse is, and in a way why. His article links to several other pieces, in the New Yorker and by Harry Lewis. Even The Economist has a take on it.

Words are known to evolve from their original definitions as language evolves, guided by changes in usage. Ask the purists and they will tell you that words like free market or Marxism have been stretched and used so loosely that the usage often extends far enough to undermine their original meanings.

We, after all, live in a world that does not run with the philosopher's rigor in definitions. But what bothers the articles' authors is that the term disruption is being used in cases that are not disruption according to their definition. Uber is not a disruption, for example, nor are many others cited in the article.

Two conditions are necessary to qualify as a disruption: 1) it happens from the lower end of the market or a new market, and 2) it does not take hold until the quality catches up to the needs of mainstream customers.

The authors argue that if these conditions are not met, then it is sustaining innovation. Tesla's breakthrough technology, for example, is just an innovation, and indeed not a disruptive one, since it does not address the low end of the market, and the quality is already quite high for what mainstream customers would desire. It serves a distinct market segment that is looking for something different and is willing to pay a premium.

While I understand the theory, and it is a neat way to distinguish the disruptive forces, I find it hard to use it to forecast disruptive innovation. In hindsight, after the fact, it is very easy to detect an innovation and call it disruptive. But a priori, I see it as just another strategic bet. For example, we can imagine that battery storage is potentially disruptive – if the chemical attributes can improve to meet the minimum thresholds for an economically viable battery. But how do I know when the next advances will happen, or whether they will ever happen?

I end up asking the same strategy question: what do I have to believe the advances will be for them to qualify for a disruptive trajectory? The framework gives no insight. If that's the case, do we really need the disruptive innovation framework, or will a plain vanilla strategy framework do? Are disruption problems a separate class of problems or just another name for a subset of traditional strategy problems? I am leaning toward the latter. As The Economist points out, the term disruptive innovation may be too narrow.


Openness in Software

The New York Times had a fascinating article on the state of car software and how hideously dangerous it can be. Here is the article. I find a parallel with the power system, and a lesson to be learned – or, for some, reminded and reinforced. As more software becomes integral to the operations of the "digitized grid," it is fair to assume the situation will be quite similar.

Indeed, there are tremendous benefits from this software, and there is no reason to turn back. But the biggest problem is that we don't know what is inside – what the code does. The problems that are emerging are of unprecedented proportions and enterprise-threatening. There is a level of opaqueness that makes this most concerning. We have no idea what the code is doing behind the scenes. Most of it is closed and not known to the industry. Companies have historically cited proprietary rights and treated these artifacts as a source of competitive advantage. This will sound familiar to those wrestling for open standards and interoperability. Open standards not only allow more transparency; they also allow issues and problems to be detected and even solved. As complexity grows, relying solely on the manufacturer is not enough. Not that manufacturers are ill-intended, but the systems are just too huge for any one of them to have a systemic view and postulate all the possible cases, even the unintended ones. This is part of the silo effect that Gillian Tett explains in her book.

Openness in software cannot be viewed merely as a threat to competitive advantage. Getting hacked, recalling products and systems to patch software, and paying a fortune in fines or damage claims whenever a violation or non-conformance comes to light are what can erode company profits and pose serious threats to a company's existence.

Companies across multiple industries need to think about the real consequences and heightened risks of closed systems. It is time the old myths were done away with and the antidotes to the challenges of the new reality instituted.

Utilities and Airlines

Utilities and airlines have several things in common: they are highly regulated industries, in many countries still owned and operated by the government. Structurally, they are high-fixed-cost, low-marginal-cost businesses with high barriers to entry. If that's the case, there may be similarities in how they might evolve. The airline industry has evolved into a set of characteristics that have been quite controversial: deteriorating customer service, lower costs, and constant acrimony, yet no clear view of whether the downward slide can be checked. Only theories exist. Most recently, the New Yorker argued that basic services are lowered so that deliberate misery pushes customers to pay high fees to avoid it. This lets airlines earn high profits through fees on other services such as baggage, drinks, and extra-legroom seats. The Economist adds:

“High-fixed-cost, low-marginal-cost industries are characterized by brutal competition and punishing boom and bust cycles. Which is exactly what we see in the airline industry. Over the last 15 years, the three remaining major airlines—Delta, United and American Airlines—have averaged profit margins of 3 to 8 percent, with periodic dips into deep red. Things aren’t getting more crowded and fees higher because it’s a good way for them to shake a little more off the money tree. Rather, the only way that they can make any money is to schedule more flights, cram more seats into the planes and manage their yield so that the planes fly fuller. The result is unpleasantly reminiscent of cattle walking up the slaughterhouse chute. But unlike the cattle, we have to claim our own share of the responsibility. Ultimately, the reason airlines cram us into tiny seats and upcharge for everything is that we’re out there on Expedia and Kayak, shopping on exactly one dimension: the price of the flight. To win business, airlines have to deliver the absolute lowest fare. And the way to do that is…to cram us into tiny seats and upcharge for everything. If American consumers were willing to pay more for a better experience, they’d deliver it. We’re not, and they don’t.”

I don't fully agree. In such industries there are major barriers to entry. In international markets, airlines have taken different strategies to address different customer segments. There are many low-cost players, but Singapore Airlines has made customer satisfaction its differentiator, and customers are willing to pay for it. In the US, we don't have a well-represented market. All US airlines (apart from low-cost Southwest and others) are playing a "me too" low-cost game. We are stuck, in a way, since companies like Singapore Airlines can't enter and operate in the US domestic market. This is a shame, especially for the bastion of capitalism.

This makes me wonder. Given the similarities with utilities, if we reduce regulations and oversight, is there a possibility that the base service level actually decreases so that companies can unbundle and make people pay for what was once part of the package? For example, every time we set up a new connection, will we start getting DIY kits and have to pay a ransom to get a technician out? It is possible. Hence, deregulation needs proper foresight. Badly implemented policies can result in degradation of services and a deadlock where the optimal point for owner profits initiates a race to the bottom in customer service levels, since the markets would not function well. Due to high risks and fixed costs, companies are reluctant to take the bet and enter a new segment.