Rethinking Research: Time for a Paradigm Shift
Thousands of scientists publish a paper every five days. Is that good news?
Theories of scientific and technological change suggest that accumulated knowledge enables future progress. However, despite exponential growth in new scientific and technological knowledge, progress is slowing in several major fields. This was not the case last century, when we witnessed an unprecedented expansion of scientific and technological knowledge.
A study published in Nature analyzed 45 million papers and 3.9 million patents from six large-scale datasets to characterize how papers and patents change networks of citations in science and technology. It found that papers and patents are increasingly less likely to break with the past in ways that push science and technology in new directions. These results might suggest a fundamental shift in the nature of science and technology.
The study has created considerable controversy in the research community, including critical and emotional responses from large educational and scientific societies such as the IEEE and ACM.
As someone who straddles the worlds of research and innovation, I feel compelled to step up. But instead of engaging in a pointless battle between researchers and Nature, creating more controversy and adding fuel to the fire, let's consider the arguments raised by both sides and bring in some critical points that have been left out of the discussion.
Different Words, Different Meanings
1. Research vs. Innovation
Many people find it hard to tell research and innovation apart, so it is worth defining both terms properly. According to the Cambridge Dictionary:
- Research: a detailed study of a subject, especially to discover (new) information or reach a (new) understanding.
- Innovation: (the use of) a new idea or method
Voila!
Research is about digging deep into a subject to uncover new information, while innovation is about applying a fresh idea or method to create something new. It's easy to see that these are two different beasts, each requiring a unique set of skills to master.
If you're a researcher at heart, you'll need to be able to think outside the box, connecting seemingly disparate ideas to create something new and worthwhile. You'll also need to be able to communicate your findings in a way that's easy for others to understand.
However, if you're more of an innovator (or entrepreneur), you'll need a doer mindset: you have to be willing to take risks and turn abstract ideas into something tangible. That includes not just creating a new product or service, but also marketing it effectively and ensuring it is financially sustainable in the long run (something researchers usually don't worry about).
While engineers, technologists, and entrepreneurs may have this business-oriented mindset, scientists and philosophers would argue that it is in our nature to seek answers to fundamental questions. That pursuit may not always result in a tangible product, but it is an essential part of the human experience. This fascinating philosophical topic deserves a dedicated discussion another time.
Ultimately, innovation is the final frontier, the end goal that delivers real value and impact to society. So, whether you're a researcher or an entrepreneur, humanity needs both kinds of people to move forward, push the boundaries, and turn new and exciting discoveries into something useful.
2. Disruption Is Not Innovation (Or Research)
Innovation is often associated with disruption, but is that the only goal of research?
While disrupting an industry or market with a new concept can be game-changing, it's not the only way to make progress. Indeed, some of the most impactful technologies and discoveries took years, if not decades, to develop.
It's important to remember that research is a long-distance race, not a sprint. By testing ideas, designing experiments, and gathering data, researchers make incremental progress toward their objectives. And while some discoveries may not have immediate applications, they can pave the way for future breakthroughs. Innovation is never a single event but a long process, and it is almost never achieved by one person or organization alone:
- Marie Curie's groundbreaking discovery of radium laid the foundation for the development of radiotherapy, which was not realized until many years later. Nowadays, we can all appreciate the immense benefits of this pioneering research every time radiotherapy is used to treat cancer.
- Alexander Fleming discovered penicillin in 1928, but it wasn't until 15 years later, in 1943, that the miracle drug came into widespread use. Fleming was unable to bring penicillin to market because he lacked the necessary skills (he was a bacteriologist, not a chemist or a manufacturer). It took a decade before Howard Florey and Ernst Boris Chain took up the challenge and worked out how to purify and mass-produce it. Even then, it took people with additional expertise in fermentation and manufacturing to turn it into the miracle cure we know today.
- Alan Turing came up with the idea of a universal computer in 1936, but it wasn’t until 1946 that the first working model was built. And it took until the 1990s for computers to significantly impact productivity metrics.
- Steven Sasson invented the digital camera in 1975 while working at Kodak on an innovation project, but he was forced to keep it hidden. Strangely enough, 37 years later, Kodak declared bankruptcy. That was partly attributable to the rise of smartphone photography and apps like Instagram, which disrupted Kodak's business model (built on scarce physical film rolls) with an abundance of free digital photos. Irony at its finest, wouldn't you say?
Another recent example is artificial intelligence (AI), which is on everyone's minds today:
- The algorithmic roots of AI go back more than two centuries: around 1805, Adrien-Marie Legendre and Carl Friedrich Gauss developed least-squares regression to determine the orbits of celestial bodies. Much later, in 1943, Warren McCulloch and Walter Pitts created a conceptual model of a neural network.
- The World Wide Web, created at CERN and opened to the public in 1991, paved the way for an explosion in the amount of data available to us. Fast forward to 2023, and we are experiencing an incredible influx of information from over 14.4 billion internet-connected devices, generating a staggering 2.5 quintillion bytes of data every day. The sheer volume of data available to us is changing the way we live, work, and interact with the world around us.
- Thanks to exponential improvements in computational power and storage, the sky's the limit for what AI can achieve. With the power of the cloud, we're seeing massive strides in the field of AI, from natural language processing to image recognition. This progress has been fueled in part by Moore's Law, which predicts that computing power doubles roughly every two years, and Kryder's Law, which states that data storage density doubles roughly every 18 months.
With these three trends intertwining and accelerating each other, the possibilities for AI keep expanding; the short sketch below shows just how quickly that compounding adds up.
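The sketch is purely my own illustration (not something from the article or the Nature study): a few lines of Python that simply compound the two doubling periods quoted above over several time horizons.

```python
# Minimal sketch: compound the doubling periods quoted above.
# Assumptions: compute doubles every ~2 years (Moore's Law),
# storage density doubles every ~18 months (Kryder's Law).

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Multiplicative improvement after `years` for a given doubling period."""
    return 2 ** (years / doubling_period_years)

for years in (5, 10, 20):
    compute = growth_factor(years, 2.0)   # Moore's Law
    storage = growth_factor(years, 1.5)   # Kryder's Law
    print(f"{years:>2} years: compute x{compute:,.0f}, storage x{storage:,.0f}")
```

Under those assumptions alone, a single decade buys you roughly a 30-fold jump in compute and a 100-fold jump in storage density, before counting any algorithmic progress.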
The point is, progress takes time and patience (something that researchers have plenty of), as well as the combination of different disciplines and technologies. And research is often the glue that holds it all together. So, while disruption may be a consequence of innovation, it's not the only end goal. Instead, let's appreciate the power of research to drive progress, even if it takes years or decades to come to fruition.
Measure What Matters
1. Research metrics
Are you tired of meaningless metrics? You're not alone. In today's business world, companies are quick to create metrics that look good on paper but fail to provide any meaningful insights. That's why it's important to use metrics that make sense and help you reach your objectives. Because you are what you measure. Sadly, research is no exception to this rule.
And don't be fooled into thinking that metrics are your end goal. When you start focusing solely on metrics instead of measuring how well you're doing against your objectives, you're heading for disaster. It's a common mistake, even for top advisory firms.
"When a measure becomes a target, it ceases to be a good measure" - Goodhart's law
Unfortunately, many of the metrics currently used in research are just vanity metrics. They might make you look good to others, but they don't help you understand your performance or how close you are to achieving your goals. For example, "the number of patents, number of publications, and number of journals in a field of study" says nothing about how innovative you are or about your impact on society. These figures might show that you're good at writing new articles or at applying old concepts to new fields, but they're worthless if you want to measure true innovation.
And to make matters worse, most of these metrics are lagging metrics, meaning they only measure your past performance and provide no indication of what the future looks like.
This doesn't mean that you should have no metrics (as Peter Drucker pointed out, "You can't improve what you don't measure"), but it is crucial to track the right metrics and adjust them when you are not achieving the expected outcomes. Metrics are not written in stone.
Microsoft is a clear example of this. When Satya Nadella took the helm at Microsoft in 2014, he realized that a massive change was needed to shift from a license model to a subscription model (along with the decision to challenge Amazon in the cloud). But he knew it wouldn't be easy. That's why he decided to change the KPIs of every employee in the company, recognizing that changing employee behavior required changing the way they were measured. Over three years, every legacy KPI was gradually realigned with Nadella's new strategy. Today, Microsoft's mobile-first, cloud-first strategy has revolutionized the company, making it one of the biggest technology firms in the world.
Therefore, if you want to truly understand your progress and achieve your research goals, it's time to rethink your metrics strategy.
2. You reap what you sow
Have you ever wondered why there seem to be so many research articles flooding the market, even if they don't add anything new or impactful to society? The answer is simple: money.
If your salary, bonus, perks, or project funding depends on the number of papers you produce, you'll end up churning out more papers, regardless of their actual significance. If every researcher follows this path, the share of truly impactful papers will inevitably shrink. Not only is genuine innovation becoming more challenging, but the sheer number of irrelevant papers has skyrocketed.
To top it off, top scientists are highly sought after by academic institutions worldwide because they can significantly boost those institutions' positions in influential academic rankings such as the Shanghai ranking (which factors in the number of research papers and citations). And the higher the ranking, the more funding opportunities and students an institution attracts.
One of the reasons behind the suspension without pay of Rafael Luque, who is among the world's most cited researchers, may be related to his recent pace of work. Luque, a chemist, has ranked on the list of most-cited researchers for five years in a row. Yet in the first three months of 2023 alone, he authored 58 papers, roughly one every 37 hours. Nevertheless, I suspect there is more to this case than meets the eye.
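For the curious, the 37-hour figure is straightforward back-of-the-envelope arithmetic. Here is a quick check in Python, assuming a quarter of roughly 90 days (my assumption, not a detail from the original reporting):

```python
# Rough sanity check of the "one paper every 37 hours" figure.
papers = 58                  # papers reportedly signed in the first quarter of 2023
days_in_quarter = 90         # assumption: roughly three months
hours_per_paper = days_in_quarter * 24 / papers
print(f"~{hours_per_paper:.0f} hours per paper")  # prints: ~37 hours per paper
```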
Perhaps another underlying issue is that researchers are too set in their ways. They keep using the same methods and expect the same results, even though the world has changed drastically over the last 20 years. Companies and employees have adapted to these changes, so why shouldn't research institutions do the same?
It's time for the research community to take a long, hard look at itself and address the issues that need fixing. Instead of focusing solely on metrics and outputs, researchers should prioritize the quality and impact of their work.
After all, as Jack Welch once said...
"If the rate of change on the outside exceeds the rate of change on the inside, the end is near."
Innovation Is Not a One-Person Show
1. Innovation is a team effort
Innovation is not a one-person show; it usually takes a team effort to bring new ideas to life.
With the world becoming increasingly volatile, uncertain, complex, and ambiguous (VUCA), innovation requires individuals who can connect dots across fields and disciplines. More than ever, companies need people with T-shaped profiles: deep knowledge in one area combined with the agility and flexibility to work across different domains. Such individuals are highly valuable, and therefore scarce, which makes them challenging to develop internally or to recruit from other disciplines or institutions.
It's also important to remember that your company, university, or research institution is not operating in a vacuum.
In today's interconnected world, isolation is not an option, and focusing solely on improving internal processes is not enough to stay competitive. Even if you are in a regulated industry, relying on regulation and lobbies can only buy you some time. The harsh reality is that if you do not embrace change, you will eventually be replaced by someone who will.
As Guy Kawasaki, the former Chief Evangelist for Apple, famously said...
"There are always two guys in a garage planning your disappearance. Either you go ahead of them, or they will achieve it."
2. The innovation engine doesn't belong to one country
It's time to face the facts: the era of Silicon Valley and its innovation monopoly is coming to an end. The constant pressure from shareholders and unrealistic expectations from upper management have led to massive layoffs in big technology companies, and the current economic climate is not helping.
But here's the good news: Innovation is not confined to the US. India and China are emerging as major players in the game, and they are quickly catching up to their Western counterparts. Thanks to the democratization of technology, anyone, anywhere can come up with a groundbreaking idea and fully disrupt established business models.
In fact, the democratization of technology is one of the most exciting things about this new global game. It means that the smartest people aren't necessarily working for the biggest companies; they could be working for themselves elsewhere, ready to shake up the industry with their brilliant ideas.
So, if you want to stay ahead of the game, it's time to broaden your horizons and look beyond the borders of your own country. Innovation is a global engine, and it's up to you to tap into it.
Jack of All Trades, Master of None
It is well known that the best ideas emerge when you are focused, and it is hard to excel at several disciplines when you have too many things on your plate.
To be a good researcher, you need (among other things):
- Analytical and critical thinking: Researchers need to be able to analyze and interpret data, identify patterns and trends, and draw conclusions based on evidence.
- Attention to detail: Research often involves meticulous data collection and analysis, and researchers need to be able to pay close attention to small details.
- Creativity: While research involves following a scientific method, researchers also need to be able to think creatively and come up with novel approaches to solving problems.
- Writing and communication: Researchers need to be able to communicate their findings effectively through writing and presentations, both to other researchers and to the general public.
- Collaboration: Research often involves working with other researchers, and researchers need to be able to collaborate effectively as part of a team.
But to be a good professor, you need a different set of skills:
- Expertise in their subject area: Have a deep understanding and mastery of the subject they are teaching, as well as the ability to stay up to date with new developments and research.
- Effective communication: Able to communicate complex ideas clearly and concisely. They should also be able to adapt their teaching style to meet the needs of different students.
- Passion for teaching: Have a passion for teaching and a genuine desire to help students learn and succeed.
- Flexibility: Be flexible and able to adapt to different teaching environments, student needs, and learning styles.
- Patience and empathy: Show patience and empathy towards students, understanding that everyone learns at their own pace and may need additional help, and that students come from different backgrounds and with varying levels of academic ability.
- Leadership and management: Professors may be called upon to lead teams of professors, manage administrative staff, or mentor students.
- Organizational skills: Have excellent organizational skills to manage multiple courses, assignments, and student records. Not to mention anything related to administrative duties.
- Creativity: Bring fresh perspectives and innovative ideas to their teaching, inspiring students and pushing the boundaries of their field.
But why bring professors into this? Because most researchers must combine their research with teaching and mentoring students!
We may not fully appreciate the immense pressure we place on researchers. Balancing research and teaching requires switching seamlessly between very different tasks and managing both expected and unexpected interruptions to research plans. A few people may excel at both roles, but most will not. Perhaps it is time to recognize the need to split or balance these roles more effectively.
If you are still unsure about this, ask yourself: Do you recall your favorite professor during your university days? Was he or she an exceptional researcher too? What about a person you look up to in a particular field? Did they possess the patience to guide you through their research?
Conclusion: Change And Adapt
There are clear indications that we must break free from an outdated research model that is not bearing fruit and strive for more impactful innovation in society. To improve the current research model, we must first speak the same language and identify what's working and what needs to be fixed.
Even though this is a challenging and complex issue, we cannot address it without self-criticism and a strong desire to improve. While a drastic shift in research methods may not be immediately feasible, that's no reason to dismiss alternative viewpoints or to close ourselves off from new perspectives. The writer Ernest Hemingway said that "the secret of wisdom, power, and knowledge is humility". The philosopher Socrates left us another pearl: "Think not those faithful who praise all thy words and actions, but those who kindly reprove thy faults".
Instead of engaging in irrational or emotional arguments, let's focus on finding meaningful approaches to the challenges faced by the research community. Let's build bridges between research and innovation, between researchers and entrepreneurs, between thinkers and doers. Both roles are needed for the progress of humankind.
Make no mistake, the stakes are high. Because if we don't learn, adapt, and change, humanity will suffer.
"The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn" - Alvin Toffler
Article originally published on Transformation Architect
#ScienceAndTechnology #InnovationShift #ResearchChallenges #ScientificProgress #NatureStudy #AcademicMetrics #ResearchQuality #InterdisciplinaryWork #AdaptAndChange #ScienceInnovation #ResearchInnovation
Immerse yourself in the game-changing ideas of OpenExO.
Begin your journey with an 🎟️ExOPass and read the book 📚Exponential Organizations 2.0