Antimicrobial Resistance Quietly Threatens Ethiopia’s Healthcare, Economy

Antimicrobial Resistance (AMR) is emerging as a silent crisis in Ethiopia. While striving to improve access to essential medicines, the country has inadvertently neglected a growing threat that could undermine decades of medical gains. Despite the significance of AMR as a global health crisis, health authorities’ response has been tepid, risking severe future repercussions for the healthcare system and economy.

Predictions from various studies suggest that, if unchecked, the AMR crisis could claim 10 million lives annually by 2050 and impose a global economic burden of about 100 trillion dollars. The impact of AMR extends beyond health, affecting socio-economic development across the continent, a risk acknowledged by the African Union (AU) in its endorsement of the African Common Position on Controlling Antimicrobial Resistance.

The stakes are particularly high in Ethiopia, a country with a population of over 100 million and a heavy burden of infectious diseases. However, several factors have constrained its leaders’ efforts to combat AMR.

The Ministry of Health (MoH) introduced a guideline in 2018 to establish an Antimicrobial Stewardship Program in hospitals, yet this program has not been widely implemented. Poor infection prevention practices among healthcare workers and the unregulated over-the-counter sale of antibiotics continue to aggravate the situation. A considerable proportion of antibiotic prescriptions — up to half, according to some estimates — are unnecessary, revealing the rampant misuse of these critical drugs.

The economic implications of AMR are also substantial. A study focusing on treating pneumococcal disease in children estimated the annual costs associated with AMR at around 15.8 million dollars. This includes 3.3 million dollars spent on ineffective first-line treatments, an additional 400,000 dollars on second-line treatments, and another 8.9 million dollars lost due to long-term productivity declines.

The national strategy for AMR is underdeveloped, and there is a notable lack of comprehensive data to understand the problem’s magnitude or monitor trends effectively. This data deficiency hampers informed decision-making and the design of effective interventions.

The federal government has made strides in encouraging the domestic and overseas manufacture of pharmaceuticals, leading to a burgeoning market worth between 400 million dollars and half a billion dollars a year by 2016, growing by a quarter annually. Approximately 12 pharmaceutical companies and about 200 importers of pharmaceutical products and medical consumables were operational. While these efforts have been directed towards improving access to medicines, they inadvertently overlook the critical issue of AMR due to the excessive and inappropriate use of antimicrobials.

Addressing AMR requires various approaches. Gathering and analysing data to grasp the full extent of AMR and track emerging trends is essential. Such data is crucial for tailoring interventions and assessing their effectiveness. Substantial investments are needed to implement antimicrobial stewardship programs effectively. These programs should focus on improving infection prevention practices, updating therapeutic guidelines, enhancing the skills of health professionals, and educating the public about rational drug use.

Regulatory oversight is another critical component. Effective regulation can ensure that policies, strategies, and guidelines are well-crafted and faithfully executed. This includes tightening controls over the prescription of antibiotics in both public and private healthcare settings. The issue of AMR also demands a coordinated response across various sectors, not just the health sector. Federal health officials could prioritise advocacy and engage with other sectors directly impacted to address the AMR crisis comprehensively.

While some may argue that focusing on AMR might divert resources from efforts to achieve universal health coverage, it is essential to recognise that these objectives are not mutually exclusive. Implementing programs to counter AMR can go hand in hand with expanding access to healthcare. Indeed, the benefits of antimicrobial stewardship programs, which include preserving the effectiveness of existing drugs and reducing healthcare costs, far outweigh the initial resource allocation required to establish and maintain them.

Ethiopia faces daunting problems in addressing antimicrobial resistance, and the urgency of the challenge should be recognised. It cannot be ignored. Without a meaningful effort to understand and combat AMR, the population risks a consequential setback to its health and economic progress. The time to act is now, with a clear focus on data collection, regulatory enforcement, and multisectoral cooperation to forge a sustainable response to a complex global health threat.

 

Fertilisers Will Not Fix Africa’s Food Crisis

The world is confronting an unprecedented food crisis, exacerbated by the COVID-19 pandemic, Russia’s war against Ukraine, and worsening climate conditions. However, the problem is most acute in Africa, where 61pc of the population faced moderate or severe food insecurity in 2022. At a moment when effective solutions are urgently needed, policymakers are once again coalescing around the misguided belief that increased use of mineral and synthetic fertiliser is the key to boosting agricultural productivity and ending hunger on the continent.

This approach can be traced back to the Abuja Declaration on Fertiliser for the Africa Green Revolution that African Union (AU) leaders endorsed in 2006. The goal was to reverse the continent’s poor yields by boosting fertiliser use from eight to 50kg per hectare within a decade. Spearheading this effort was the Alliance for a Green Revolution in Africa (AGRA), an initiative backed by the Bill and Melinda Gates Foundation and other major donors. Working closely with large agribusinesses like the Norway-based chemical company Yara, AGRA championed the idea that distributing synthetic nitrogen fertiliser would solve Africa’s agricultural challenges.

This singular focus on synthetic fertiliser use has failed to address the complex realities of farming in Africa.

A recent assessment of AGRA’s projects in Burkina Faso and Ghana found no evidence that providing chemical inputs and high-yield seeds led to increased production or higher incomes for smallholder farmers. Instead, many are now more vulnerable and indebted after coming to rely on expensive synthetic pesticides and fertilisers, the prices of which soared following Russia’s invasion. These farmers have become locked in a cycle of dependency, while companies like Yara reap substantial profits.

Zambia is a good example. Despite being one of Africa’s largest consumers of synthetic nitrogen fertiliser, the country has not experienced a corresponding reduction in hunger and malnutrition. The view that more fertiliser means less hunger fails to address the systemic barriers to food security, such as affordability, and exacerbates existing challenges, such as soil degradation.

Specifically, synthetic nitrogen fertilisers disrupt the delicate balance of the soil ecosystem – the very foundation of sustainable agriculture. These inputs have been shown to reduce the abundance and diversity of beneficial microorganisms, such as mycorrhizal fungi, which are essential for nutrient cycling and plant health. When these symbiotic relationships are disrupted, soil resilience and fertility decline.

According to the World Bank, Africa loses around three percent of its GDP annually due to nutrient depletion and general soil degradation.

Excessive fertiliser use has far-reaching environmental consequences. It undermines crop productivity, with devastating effects on millions of smallholder farmers’ livelihoods and food security. It contributes to nitrogen pollution in water bodies, causes biodiversity loss in aquatic systems, and pushes the planet past safe limits for humans. Perhaps most worryingly, research indicates that the production and application of synthetic nitrogen fertiliser account for roughly two percent of total global greenhouse gas (GHG) emissions.

As a result, chemical companies like Yara are switching to “green fertilisers,” which are produced using hydrogen derived from renewable energy sources rather than fossil fuel-based inputs. This allows them to continue advocating for synthetic fertilisers as a solution to food insecurity in Africa (and, by extension, maintaining and expanding the market for their products), even as research points to the shortcomings of such an approach.

Using green hydrogen to produce fertiliser can indeed mitigate GHG emissions. However, while the production process may be less carbon-intensive, it is still highly energy-intensive. Applying fertiliser can release vast surges of nitrous oxide—a potent GHG—into the atmosphere and can still cause soil degradation and water pollution, regardless of how it is produced. By promoting “green fertiliser” as a panacea, the industry is engaging in greenwashing—using the veneer of sustainability to protect its interests.

Last week, the AU’s Africa Fertiliser & Soil Health Summit in Nairobi addressed soil degradation and food insecurity. The involvement of industry giants like Yara and organisations like AGRA suggests continued adherence to a flawed model that has consistently failed to alleviate hunger and malnutrition, a concern shared by the Alliance for Food Sovereignty in Africa, which represents more than 200 million stakeholders. Instead of focusing on boosting short-term soil fertility and substituting one chemical for another, thus endorsing the fertiliser industry’s self-serving narratives, the summit should have considered longer-term goals: improving soil health and soil life, strengthening the resilience of farming communities, and ensuring the sustainability of food systems.

Productivity can be maintained without industrial nitrogen fertilisers, as shown in long-term trials across Africa. Alternatives include diversifying cropping systems, producing organic fertiliser, and planting legumes. Policymakers and stakeholders must move beyond the simplistic promotion of synthetic fertilisers, even those labelled as “green,” and embrace a more transparent and evidence-based approach. Only then can we truly address the root causes of hunger and malnutrition in Africa and worldwide.

 

A Healthcare Crisis Simmers as Millions Lack Vital Insurance Coverage

Home to over 100 million people, Ethiopia appears to be entangled in a healthcare paradox that contrasts sharply with models observed in countries like China and Kenya. The dilemma lies in limited access to medical insurance, a reality that besets someone I know, who resigned from his municipal job seven years ago due to benign prostatic hyperplasia (BPH).

A father of three, he saw his condition linger untreated for years due to financial constraints, illustrating the dire state of healthcare. The surgery he eventually received, a procedure that took only 30 minutes, revealed the inefficiencies and barriers within the current system. Medical costs are prohibitively high when weighed against average incomes. The existing insurance products often lack premium financing and innovation, making them inaccessible to the majority. Insurance coverage is a mere 0.3pc of GDP, much lower than that of African counterparts such as Kenya, Egypt, Morocco, and South Africa.

A recent visit to Kenya showed me that the East African country has a much more vibrant medical insurance ecosystem, supported by a blend of private insurance options and government-run schemes like the National Health Insurance Fund (NHIF), integrated with Public-Private Partnership (PPP) services. The array of choices supports Kenyan citizens and empowers healthcare providers with a broader range of financial tools. However, the situation there is far from perfect.

Despite a more robust healthcare financing framework, Kenyan physicians were on strike during my visit, advocating for better compensation. They seemed determined to voice their misgivings on the broader regional challenges of ensuring that healthcare systems reach and adequately support the people.

No less acute is the financial struggle for healthcare professionals in Ethiopia. Monthly salaries can dip as low as 200 dollars, sometimes pushing specialists and subspecialists to resort to crowdfunding for life-saving treatments. The climate of financial insecurity contributes to a pervasive sense of despair within the medical community.

Federal health officials could take a cue from global best practices by promptly implementing Social Health Insurance (SHI) to protect formal sector workers and centralising the Community Based Health Insurance (CBHI) pool to enhance service delivery. Such measures could improve the financial health of hospitals, which in turn would address the long waiting periods for receivables and integrate more efficiently with private service providers.

The private sector also has a critical role to play. Moving beyond traditional fee-for-service models to introduce prepaid healthcare services could align more closely with global healthcare trends. There is room for innovation in the insurance products offered.

Our efforts in this domain are gaining momentum. The National Bank of Ethiopia’s focus on enhancing health and life insurance leadership is promising.

We recently won recognition in the FSD Africa’s Bimalab’s 2023 Pan African InsurTech Accelerator Program, alongside Kacha, for designing a successful healthcare premium financing pilot. This initiative is part of a broader transition towards establishing Family Tena Life Insurance, set to become Ethiopia’s first dedicated health and life insurer. This collaboration among hospitals, medical professionals, and passionate advocates aims to make healthcare available and affordable.

These initiatives should be more than business development. They should assist in building a more sustainable healthcare system that enhances the security and quality of life for all Ethiopians. By improving the financial underpinnings of healthcare, Ethiopia can better equip its medical professionals, enhance access to basic facilities, and address the risks associated with healthcare needs.

Indeed, the economic implications of these healthcare reforms extend beyond the immediate benefits to patients and providers. By reducing the need for healthcare tourism through investments in local tertiary care facilities funded by private pension contributions, Ethiopia could retain considerable financial resources within its borders. Such strategic shifts in healthcare financing and insurance could pave the way for healthier, more prosperous Ethiopians.

The path forward is not merely advisable but essential. As countless experiences show, the cost of inaction is measured in human lives and missed economic opportunities. Through collaborative efforts between the public and private sectors, Ethiopia has the potential to transform its healthcare ecosystem, ensuring that all its citizens have access to the care they need and deserve.

 

What Mission-Driven Government Means

The COVID-19 pandemic, inflation, and wars have alerted governments to the realities of what it takes to tackle massive crises. In extraordinary times, policymakers often rediscover their capacity for bold decision-making. The rapid development and deployment of COVID-19 vaccines was a case in point.

But preparing for other challenges requires more sustained efforts in “mission-driven government.” Recalling the successful language and strategies of the Cold War-era moonshot, governments worldwide are experimenting with ambitious policy programs and public-private partnerships in pursuit of specific social, economic, and environmental goals. For example, the Labour Party’s five-mission campaign platform in the United Kingdom has kicked off a vibrant debate about whether and how to create a “mission economy”.

Mission-driven government is not about achieving doctrinal adherence to some original set of ideas; it is about identifying the essential components of missions and accepting that different countries might need different approaches. As matters stand, the emerging landscape of public missions is characterised by a re-labelling or repurposing of existing institutions and policies, with more stuttering starts than rapid takeoffs.

But that is okay. We should not expect a radical change in policymaking strategies to happen overnight, or even over one electoral cycle.

Particularly in liberal democracies, ambitious change requires engagement across a wide range of constituencies to secure public buy-in, and to ensure that the benefits will be widely shared. The paradox at the heart of mission-driven government is that it pursues ambitious, clearly articulated policy goals through myriad policies and programs based on experimentation.

This embrace of experimentation is what separates today’s missions from the missions of the moonshot era (though it does echo the Roosevelt administration’s experimental approach during the 1930s New Deal). Major societal challenges, such as the urgent need to create more equitable and sustainable food systems, cannot be tackled the same way as a moon landing. Such systems consist of multiple technological dimensions (in the case of food, these include everything from energy to waste management), and involve widespread and often disconnected agents and an array of cultural norms, values, and habits.

Transforming such complex systems requires a portfolio of programs aimed at a common goal, not a strategy dictating how each sector or enterprise should solve its respective part of the challenge. Rather than trying to conceptualise the complexity away, today’s successful missions will make it central to policymaking.

Success thus depends on understanding what missions are not supposed to be. For starters, missions are not top-down planning exercises directed by omniscient policymakers. The process relies on entrepreneurial discovery and competition in the private sector to push along the experimentation needed to figure out which solutions work.

Nor are missions synonymous with industrial policy, but they can (and arguably should) shape such policies and clarify their purposes or success metrics.

For example, what does a policy to boost competitiveness mean? Are we talking about increasing productivity, exports, and GDP, or about wages and more sustainable forms of growth?

The latter would require a mission directive because markets on their own would not necessarily deliver the intended outcomes. Missions are not only about science, technology, and innovation policies. Investing in high-quality education and basic research does not require a mission. We already know that doing so yields broad social and economic benefits. But when we want education and research to help us address a specific challenge, we need a mission.

For example, if the UK hopes to leverage its innovation system to tackle inequality, it must ensure that funding contributes to the diversity of what is being studied, researched, or developed.

Likewise, overall growth is not a mission. Of course, missions can encourage cross-sectoral collaboration, innovation, and investments to pursue a single goal, thereby generating technological spillovers, contributing to productivity and job creation, and ultimately generating economic growth. But reciprocity must be built into contracts: subsidies, loans, and guarantees should be conditional on the business sector investing in innovation, leading to better (more inclusive and sustainable) production and distribution systems.

For example, the US CHIPS and Science Act requires semiconductor companies that receive public funds to reinvest profits (instead of buying back their shares) in improved working conditions and energy-efficient supply chains. When properly structured in this way, missions can have a multiplier effect, generating greater business investment and ultimately boosting GDP more for every dollar invested.

Simply agreeing on ambitious, societally relevant goals is not enough. Missions require a fundamental rethinking of policymaking tools and processes. Yes, prescribing specific solutions, building Gantt (project-management) charts, and layering on hefty reporting requirements will not excite anybody. But it is equally true that providing open-ended, no-strings-attached subsidies to businesses will not produce the kind of growth we want, nor will it serve the common good.

Missions require significant investment in the public sector’s capacity. Without this, we will always hear that mission-oriented government is a pipe dream – precisely the argument used to justify years of outsourcing to private consultants.

The less we believe that governments can do anything other than fix market failures, the less we will invest in the public sector’s broader potential. While it is not easy to direct innovation through outcomes-oriented policies, bottom-up innovation across sectors, and inter-ministerial processes, it is possible. The problem is that we remember this only during wars or crises. We founded the UCL Institute for Innovation & Public Purpose to change how outcomes-oriented civil service is perceived, and to put “new economic thinking” about market-shaping policies to real-world use.

From Australia and Sweden to Brazil, there are great examples of innovation agencies experimenting with new ways of working: testing solutions through pilot projects and incorporating successful programs into larger portfolios of interventions. These efforts have also required organisational innovations, from creating new roles to fostering new management cultures.

Mission-driven government is critical to achieving sustainable and inclusive economic growth and tackling the big challenges countries face. It does not need to follow a fixed path, but it does call for fundamental changes to how governments work and greater investment in public-sector capabilities.

 

Unhooking the Liquid Gold Taboo

Motherhood has a way of unveiling unexpected truths. Following the birth of my daughter, I was confronted with a striking realisation: the prevalence of misinformation and stigma surrounding breastfeeding. It was a realm where I encountered surprising perspectives.

Many older women, whom I had presumed to be wellsprings of wisdom due to their experience raising children, offered unsolicited advice steeped in outdated beliefs. They championed the perceived superiority of formula and voiced concerns that I was denying my daughter essential nutrients by exclusively breastfeeding. Their opinions disregarded evidence-based recommendations from credible sources like the World Health Organisation (WHO) and the Centers for Disease Control and Prevention (CDC). On top of that, they deemed public breastfeeding, even when discreetly covered, taboo—an attitude that left me bewildered.

Nevertheless, I made the informed choice to exclusively breastfeed for the initial six months, followed by continued breastfeeding, supplemented with other foods, until my daughter reaches two years old. This decision came with sacrifices; I put my career on hold while my husband took on additional work. But the rewards are immeasurable. We are fortunate to provide our child with vital nutrients.

In Ethiopia, where workplace support for breastfeeding mothers is lacking, many women are compelled to cease the practice due to practical constraints and societal pressures. I have even witnessed a close relative from a privileged background being discouraged from breastfeeding, as it was deemed suitable only for the underprivileged. She now deeply regrets depriving her son of the advantages of breast milk.

The manifold benefits extend beyond physical health, encompassing cognitive, emotional, and immunological enrichment that forms the bedrock of lifelong well-being. It helps form a profound emotional and psychological bond between mother and child. Directly providing nourishment and sustenance instils a profound sense of accomplishment and capability.

Breastfeeding redirects attention from societal beauty ideals toward the empowerment found in nurturing and sustaining life. Its immune-boosting attributes fortify the infant’s immune system while meeting emotional needs. Feeding every two to three hours hones a mother’s ability to discern her baby’s cues, heightening her proficiency in recognising the subtle signs that indicate her infant’s needs. The cognitive, emotional, and immunological enrichment in children transcends the creation of a healthy family and societal dynamic.

Although the benefits of breastfeeding for both mothers and babies are undeniable, Ethiopian women find it difficult to maintain upon returning to work. Regrettably, many offices even in Addis Abeba lack dedicated facilities for breastfeeding. While the city’s Women, Children & Social Affairs Bureau endeavoured to allocate space for childcare services in up to 1,000 public offices last year, its implementation requires earnest attention.

Instituting workplace policies that afford nursing mothers dedicated spaces, flexible breaks for pumping, and safe storage facilities enables mothers to nurture their children while fulfilling their professional obligations. Frequent pumping at work helps mothers maintain their milk supply, which operates on a demand-and-supply basis. Meanwhile, educating the public and creating a space where mothers feel confident breastfeeding anywhere without fear of judgment or criticism is imperative to dispel misconceptions and barriers.

Investing in breastfeeding today is an investment in the health, vitality, and prosperity of generations to come. Providing infants with optimal nutrition lays the groundwork for a generation with stronger immune systems, enhanced cognitive development, and emotional stability. It contributes to the emergence of a thriving populace.

Breaking the Persistent Venting Cycle

A lengthy phone call with a close friend recently left me surprised. She brought up a familiar issue, seeking advice even though we had discussed it before. I recall that we had talked it through, reaching a seeming resolution about a year ago. However, the same frustration resurfaced.

Initially, I assumed she just needed to get it out of her system, but mid-conversation I realised that simply acknowledging her struggles could be the emotional support she needed to move forward. After she vented for what felt like an hour, she asked for my perspective.

While venting can be a valuable release valve, like spilling a steaming cup of tea to cool down, it does not necessarily translate to a desire for solutions. Sometimes, we simply crave the comfort and validation of our struggles. However, dwelling in this cycle can be like pushing the snooze button on a life alarm – it might offer temporary relief, but the underlying issues remain unaddressed.

Understanding why someone revisits problems can be tricky. Unforeseen circumstances, external pressures, or internal anxieties might be at play. The key is to shift the conversation from venting to exploring solutions. Fear of change and failure can be paralysing. Discussing the problem becomes a way to avoid that intimidating first step into the unknown. Complaining acts as a pressure valve, temporarily releasing frustration. In a world overflowing with uncertainties, negativity becomes a strange comfort, a way to connect with others through shared grievances.

Change is daunting, with its potential to disrupt routines and comfort zones. Chronic complainers, especially, may find solace in the familiar negativity, a well-worn path compared to the uncharted territory of solutions. Blaming external factors allows them to avoid personal responsibility and the effort required for change. This “victim mentality” shields them from the perceived risks of failure, but ultimately hinders progress. Stuck in this negativity bubble, they miss out on opportunities for growth and improvement.

Recognising the limitations of simply listening, I knew I needed a more proactive approach. Drawing on what psychologists refer to as “motivational interviewing”, I began asking open-ended questions that gently nudged my friend towards exploring solutions. It is a technique used by therapists and others to help people explore and strengthen their motivation for change.

I asked open-ended questions that encouraged reflection and self-discovery. Asking whether she had considered approaching the issue from a different angle, and what her version of the ideal outcome would be, helped her shift from frustration to possibility.

It was better to be solution-focused. The prospect of addressing overwhelming problems can be daunting, so I reassured my friend that my support extended beyond simply listening. We broke down the issue into smaller, achievable steps. I believe celebrating each small victory along the way would build momentum and a sense of accomplishment, ultimately propelling her forward in overcoming her challenges.

Setting boundaries is crucial in a supportive role. Offering a listening ear for 20 minutes and then gently suggesting brainstorming solutions together shows respect for both time and progress. The ultimate goal is to move beyond the cycle of venting and guide individuals toward taking tangible actions. By being a supportive presence, promoting potential solutions, and setting boundaries, I learned that it is possible to empower friends to discover long-lasting resolutions and achieve the outcomes they desire.

Tax Authorities’ Sisyphean Saga of Shrinking Revenue, Gov’t Soaring Ambitions


Soaring Hidden Costs Shine Behind Solar, Wind Power

The cost analysis of solar and wind energy often omits a critical factor: reliability. While solar energy might cost less than natural gas, the marginal difference holds only under the condition of ‘when the sun is shining’. Incorporating reliability costs can skyrocket the price of solar energy by 11 to 42 times, argued Bjorn Lomborg (PhD), president of the Copenhagen Consensus and a visiting fellow at Stanford University’s Hoover Institution.

Despite constantly being told that solar and wind are now the cheapest forms of electricity, governments across the world needed to spend 1.8 trillion dollars on the green transition last year. US President Joe Biden conveniently justified spending hundreds of billions of dollars on green subsidies, claiming “wind and solar are already significantly cheaper than coal and oil.” Indeed, arguing that wind and solar are the cheapest is a meme employed by green lobbyists, activists and politicians worldwide.

Unfortunately, the claim is wildly deceptive.

The first reason the cheapest-electricity claim is wrong is the intermittency of green energy. Wind and solar only produce power when the sun is shining or the wind is blowing. All the rest of the time, their electricity is effectively infinitely expensive, and a backup system is needed. This is why global electricity remains almost two-thirds reliant on fossil fuels—and why, on current trends, we are an entire century away from eliminating fossil fuels from electricity generation.

Imagine a solar-driven car launched tomorrow, running cheaper than a gas vehicle. It seems alluring—until we realise that the car would not run at night or when it is overcast. We would still need a gas vehicle as a backup and would have to pay for two cars. That is what happens with renewable energy.

Modern societies need power 24/7. Unreliable and intermittent solar and wind bring enormous, often hidden costs. This is a minor problem for wealthy countries that have already built fossil power plants and can use more of them as backups. However, it will make electricity more expensive, as intermittent renewables make everything else intermittent, too.

But in the poorest, electricity-starved countries, there is little fossil fuel energy infrastructure. Hypocritical wealthy countries refuse to fund sorely needed fossil fuel energy in the developing world. Instead, they insist that people cope with unreliable green energy supplies that cannot power water pumps or agricultural machinery to lift populations out of poverty.

It is often reported that extensive and emerging industrial powers like China, India, Indonesia and Bangladesh are getting more energy from solar and wind. But these countries get much more additional power from coal. Last year, China got more additional power from coal than it did from solar and wind. India got three times as much, Bangladesh got 13 times more coal electricity than it did from green energy sources, and Indonesia an astonishing 90 times more. If solar and wind were cheaper, why would these countries miss out? Because reliability matters.

The typical way to measure the cost of solar ignores its unreliability and tells us the price of solar energy “when the sun is shining.” The same is true for wind energy. That does indeed make their cost slightly lower than any other electricity source. The US Energy Information Administration puts solar at 3.6¢ per kWh, just ahead of natural gas at 3.8¢. But suppose the cost of reliability is included. In that case, the actual costs explode: one peer-reviewed study shows an increase of 11 to 42 times, making solar the most expensive source of electricity, followed by wind.
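As a rough illustration of how the multiplier reshapes the comparison, the quoted figures can be combined in a few lines. This is a sketch that applies the 11-42 times range directly to the headline price, a simplifying assumption rather than the cited study's own methodology:

```python
# Back-of-envelope check of the figures quoted above. The 3.6c/3.8c prices
# and the 11-42x range come from the article; applying the multiplier
# directly to the headline price is a simplifying assumption.
headline_solar = 3.6   # cents per kWh (EIA figure quoted in the article)
headline_gas = 3.8     # cents per kWh, natural gas

low_mult, high_mult = 11, 42   # reliability-cost multiplier range
adjusted_low = headline_solar * low_mult
adjusted_high = headline_solar * high_mult

print(f"Unadjusted: solar {headline_solar} cents vs gas {headline_gas} cents per kWh")
print(f"Reliability-adjusted solar: {adjusted_low:.1f}-{adjusted_high:.1f} cents per kWh")
```

Even at the bottom of the range, the adjusted figure is roughly ten times the price of gas, which is the article's point about why the headline comparison misleads.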

The enormous additional cost comes from the need for storage. Electricity is required even when the sun is not shining, and the wind is not blowing, yet our battery capacity is woefully inadequate.

Research shows that every winter, when solar energy contributes very little, Germany has a “wind drought” of five days, during which wind turbines also deliver almost nothing. That suggests batteries will be needed for a minimum of 120 hours, although the actual need will be greater, since droughts sometimes last much longer and recur before storage can be refilled.
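The arithmetic behind the 120-hour figure is simple to sketch. The five-day drought comes from the article; the 60 GW average German load used below is an illustrative assumption, not a figure from the article:

```python
# Sketch of the storage arithmetic above. The five-day drought is the
# article's figure; the 60 GW average demand is an illustrative assumption
# to show the scale of energy such a window implies.
drought_days = 5
min_storage_hours = drought_days * 24          # 120 hours, as stated

avg_demand_gw = 60                             # assumed average German load
energy_to_bridge_gwh = min_storage_hours * avg_demand_gw

print(f"Minimum storage window: {min_storage_hours} hours")
print(f"Energy to bridge it at {avg_demand_gw} GW: {energy_to_bridge_gwh:,} GWh")
```

At that assumed load, bridging a single five-day lull would take thousands of gigawatt-hours of storage, which is why the article calls current battery capacity woefully inadequate.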

A new study looking at the United States shows that to achieve 100pc solar or wind electricity with sufficient backup, the US would need to be able to store almost three months’ worth of annual electricity. It currently has seven minutes of battery storage. The US would have to pay five times its current GDP to buy the batteries. And it would have to repurchase the batteries when they expire after 15 years.

Globally, the cost of having sufficient batteries would be 10 times the global GDP, with a new bill every 15 years.

There is a second reason why the claim is incorrect. It leaves out the cost of recycling spent wind turbine blades and exhausted solar panels. A small town in Texas is overflowing with thousands of enormous blades that cannot be recycled. In poor African countries, solar panels and their batteries are already being dumped, leaking toxic chemicals into the soil and water supplies. Because of lifetimes lasting just a few decades and pressure from the climate lobby for an enormous ramp-up in use, this will only get much worse. A study shows that this trash cost alone doubles the actual cost of solar.

If solar and wind were cheaper, they would replace fossil fuels without the need for a grand push from politicians and the industry. The claim is incessantly repeated because it is convenient. We must invest much more in low-CO2 energy research and development (R&D) to fix climate change. Only a substantial boost in R&D can bring about the necessary technological breakthroughs (reducing trash, improving battery storage and efficiency, and advancing technologies like modular nuclear) that will make low-CO2 energy sources cheaper than fossil fuels.

Until then, claims that fossil fuels are already outcompeted are just wishful thinking.

The Civil Service “Do More With Less” Dilemma

As governments strive to meet the complex demands of the 21st century, adopting innovative management strategies, technological advancements, and a collaborative approach with the private sector will be crucial. In a think-piece edited here to a short form, Scott Blackburn, a senior partner of McKinsey & Company (Washington DC), Andrew Pickersgill, a senior partner (Toronto), and Jorg Schubert, a partner (Dubai), have argued for a renewed focus on customer service and productivity to enhance the ability of public sector agencies to serve the public more effectively, cultivating a more resilient and prosperous society.

Governments worldwide are at a crossroads, facing increasing pressure to enhance public trust and improve service delivery despite a complex array of socio-economic challenges. In response, many are turning to innovative management strategies and technological advancements to transform their operations and fulfil their mission of serving the public more effectively.

Public sector agencies play a vital role in societal well-being, driving scientific innovation and economic growth and providing security and hope during times of crisis. The professionalism and dedication of civil servants are critical in these efforts, touching nearly every aspect of daily life and significantly impacting the quality and longevity of citizens’ lives, their livelihoods, and community resilience.

However, the challenges facing these committed professionals are mounting.

Declining trust in government, demands for fiscal efficiency, rapid technological changes, political polarisation, and geopolitical tensions are just some of the hurdles that complicate the delivery of public services. The urgent need for sustainable and inclusive growth places further strain on these agencies as they strive to meet the rising public expectations for better customer service and more timely and budget-conscious technology and infrastructure projects.

Despite these obstacles, some public sector organisations are making meaningful strides. By adopting new technologies and management approaches and encouraging effective collaborations with the private sector, these agencies are beginning to deliver services more efficiently and effectively. For instance, improving customer service has proven to boost public trust immensely; in the United States, it has been found that every percentage-point increase in customer satisfaction with a federal agency boosts trust in that agency by as much as two percentage points.

There is also a substantial opportunity for productivity improvement within government operations. Estimates put this opportunity at between 725 billion and 765 billion dollars in the U.S. government alone, which could translate to roughly 2,000 dollars for every American citizen. If governments worldwide were to enhance their productivity to match that of their most improved peers, global productivity could potentially increase by up to 3.5 trillion dollars.

Innovation is another area where governments can make a profound impact. They can catalyse sustainable economic growth by adopting an investor mindset and partnering with the private sector. For example, the U.S. government’s collaboration with private payers to create markets like Medicare Advantage and initiatives like the Inflation Reduction Act (IRA) help de-risk investments in social goods.

Despite some progress in service delivery, many government services still fall short of public expectations. In the United States, Americans rank state and federal government services below those provided by airlines, banks, and car insurers. Satisfied government service users are nine times more likely than others to trust the providing agency and believe it is achieving its mission. Conversely, dissatisfied users are twice as likely to seek help multiple times, increasing costs and time for agencies.

The private sector often sets the benchmark for customer service, and public agencies could learn much from this sector. Focusing on the customer journey could enhance service delivery considerably. Establishing a link to value, such as reducing wait times or improving access, and identifying opportunities to reimagine the user journey are essential steps in this process. Leveraging technology can be a game changer, although governments have historically underinvested in this area. Research conducted with Oxford Global Projects indicates that only one in 200 IT projects in the public sector delivers its intended benefits on time and within budget.

Yet, measuring productivity in the public sector can be challenging since traditional metrics of inputs and outputs used in the private sector do not always apply. Nonetheless, capturing cost efficiency, quality outcomes, and service throughput can spur productivity improvements. Apart from policymaking, other governmental functions like finance, purchasing, technology, and talent management could be leveraged more effectively to pursue efficiency and improve outcomes.

The COVID-19 pandemic demonstrated the potential for governments to enhance their productivity, with public expenditures reaching historic highs as a share of GDP during the crisis. This has created an environment ripe for productivity improvements across government departments and agencies. Our recent research uncovered seven productivity levers, ranging from digital transformation and organisational optimisation to innovative sourcing and managing customer demand through self-service options.

However, transformations in government productivity often require a comprehensive approach, and our experience indicates that only a small fraction of public sector transformations meet their objectives. To capture new opportunities, governments may need to break free from institutional habits that limit transparency and favour incrementalism. Successful transformations depend on setting bold goals, mobilising the best talent to drive initiatives, fostering smart risk-taking, and investing in the right tools and systems.

Sustainable and inclusive economic growth provides the resources needed to address today’s pressing challenges. Governments are crucial in creating environments that support private sector success and equitable competition. Recent legislative actions such as the CHIPS and Science Act in the United States and the European Chips Act in the European Union reflect a renewed focus on industrial strategy, supporting key industries to secure competitive advantages and manage transitions like that of energy more effectively.

To successfully navigate this evolving terrain, government leaders might benefit from adopting more of an investor mindset, which can help balance competitive conditions for businesses and foster environments conducive to growth and innovation. Direct consultations with stakeholders should inform this approach to effectively guide decision-making and investment strategies. The skills required of modern civil servants are expanding.

Beyond traditional policy design, today’s government workers need proficiency in managing large organisations and leading complex transformational programs, often underpinned by technology. Given the ongoing public sector talent shortage and the digital skills gap, governments must adapt their workforce strategies to meet these demands. This involves refreshing the employee value proposition, implementing strategic workforce plans, and promoting a more inclusive and diverse organisational culture.

History Already Tells Us the Future of AI

Artificial intelligence (AI) and its threat to good jobs would seem to be an entirely new problem. But we can find useful ideas about how to respond in the work of David Ricardo, a founder of modern economics who observed the British Industrial Revolution firsthand. The evolution of his thinking, including some points he missed, holds many helpful lessons for us today. Private-sector tech leaders promise us a brighter future of less stress at work, fewer boring meetings, more leisure time, and perhaps even a universal basic income.

But should we believe them?

Many people may lose what they regarded as a good job – forcing them to find work at a lower wage. After all, algorithms are already taking over tasks that require people’s time and attention. In his seminal 1817 work, “On the Principles of Political Economy and Taxation,” Ricardo took a positive view of the machinery that had already transformed cotton spinning. Following the conventional wisdom of the time, he famously told the House of Commons that “machinery did not lessen the demand for labour.”

From the 1770s, the automation of spinning reduced the price of spun cotton and increased demand for the complementary task of weaving spun cotton into finished cloth. Since almost all weaving was done by hand before the 1810s, this explosion in demand helped turn cotton handweaving into a high-paying artisanal job employing several hundred thousand British men (including many displaced, pre-industrial spinners). This early, positive experience with automation likely informed Ricardo’s initially optimistic view.

But, the development of large-scale machinery did not stop with spinning. Soon, steam-powered looms were being deployed in cotton-weaving factories. No longer would artisanal “hand weavers” be making good money working five days per week from their cottages. Instead, they would struggle to feed their families while working much longer hours under strict factory discipline. As anxiety and protests spread across northern England, Ricardo changed his mind. In the third edition of his influential book, published in 1821, he added a new chapter, “On Machinery,” where he hit the nail on the head: “If machinery could do all the work that labour now does, there would be no demand for labour.”

The same concern applies today. Algorithms’ takeover of tasks previously performed by workers will not be good news for displaced workers unless they can find well-paid new tasks.

Most of the struggling handweaving artisans during the 1810s and 1820s did not go to work in the new weaving factories, because the machine looms did not need many workers. Whereas spinning automation had created opportunities for more people to work as weavers, the automation of weaving did not create compensatory labour demand in other sectors. The British economy overall did not create enough other well-paying new jobs, at least not until railways took off in the 1830s. With few other options, hundreds of thousands of hand weavers remained in the occupation, even as wages fell by more than half.

Another key problem, albeit not one that Ricardo himself dwelled upon, was that working in harsh factory conditions – becoming a small cog in the employer-controlled “satanic mills” of the early 1800s – was unappealing to handloom weavers. Many artisanal weavers operated as independent businesspeople and entrepreneurs who bought spun cotton and sold their woven products on the market. They were not enthusiastic about submitting to longer hours, more discipline, less autonomy, and typically lower wages (at least compared to the heyday of handloom weaving). In testimony collected by various Royal Commissions, weavers spoke bitterly about their refusal to accept such working conditions or how horrible their lives became when they were forced (by the lack of other options) into such jobs.

Today’s generative AI has enormous potential and has already chalked up some impressive achievements, including in scientific research. It could well be used to help workers become more informed, more productive, more independent, and more versatile. Unfortunately, the tech industry seems to have other uses in mind. As we explain in “Power & Progress”, the big companies developing and deploying AI overwhelmingly favour automation (replacing people) over augmentation (making people more productive).

That means we face the risk of excessive automation: Many workers will be displaced, and those who remain employed will be subjected to increasingly demeaning forms of surveillance and control. The principle of “automate first and ask questions later” requires – and thus further encourages – the collection of massive amounts of information in the workplace and across all parts of society, calling into question how much privacy will remain.

Such a future is not inevitable. Regulation of data collection would help protect privacy, and stronger workplace rules could prevent the worst aspects of AI-based surveillance. But the more fundamental task, Ricardo would remind us, is to change the overall narrative about AI. Arguably, the most important lesson from his life and work is that machines are not necessarily good or bad. Whether they destroy or create jobs depends on how we deploy them and who makes those choices. In Ricardo’s time, a small cadre of factory owners decided, and those decisions centred on automation and squeezing workers as hard as possible.

Today, an even smaller cadre of tech leaders seems to be taking the same path. However, focusing on creating new opportunities, new tasks for humans, and respect for all individuals would ensure much better outcomes. It is still possible to have pro-worker AI, but only if we can change the direction of innovation in the tech industry and introduce new regulations and institutions.

As in Ricardo’s day, trusting in the benevolence of business and tech leaders would be naive. It took major political reforms to create genuine democracy, legalise trade unions, and change the direction of technological progress in Britain during the Industrial Revolution. The same fundamental challenge confronts us today.

Daron Acemoglu is an institute professor of economics at MIT, and Simon Johnson, a former chief economist at the IMF, is a professor at the MIT Sloan School of Management. They are co-authors of “Power & Progress: Our Thousand-Year Struggle Over Technology & Prosperity.” This article is provided by Project Syndicate (PS).