The subjects of research and creativity are huge. I will describe some of the relevant steps below. Two questions come up again and again from my friends and students: how do we know whom to believe, and how do we express complex ideas as mathematical formulas? There is also a complementary question: how do we define the subject area and find the most imaginative and innovative solution within it?
Suppose we have already collected some knowledge and a new article arrives. How does it relate to what we know? What kind of innovation does it add? Can we generate new understanding or actionable steps based on the new information?
It is fine to trust others when we cannot do our own analysis, but when the experts disagree we must form our own opinion.
There is no single handling process, but there are some guiding principles. Once we get into the details, they are really simple and intuitive.
- Separate facts, theories, methodology, motivation, and other metadata. “Color” the data and treat it accordingly.
- A theory should be logically sound; hence it can be checked for the formal correctness of its logic and inferences.
- Some works contain ontologies and pseudomathematical formulas. These tools can be directly translated into graphs, flowcharts, and logical expressions – so-called logical markers.
- The methodology can be divided into procedural knowledge that is usually learned by hands-on practice, and the “experimental” setup that can be analyzed for built-in bias.
- The supporting facts and numbers can be cross-referenced with other resources and validated using some mental math.
- “Motivation” factors can be used for extrapolating the framework of knowledge, for example, by applying similar tools to other similar issues or as a way to address other paradigms (e.g. bursting the filter bubble).
- What to do with the new knowledge is not a part of the article. One can use imagination and daydream about a possible future or one can use formal creativity methods and try to resolve existing issues.
- Run experiments, real or simulated, to make sure that the knowledge is not only theoretically plausible but also practically sound.
The process can be gray and boring, but it also can be very cool and exciting. Every step of the process can be easily gamified.
I have a course about the subject.
Divide and conquer
Scientific articles have a very clear structure: they separate background, innovation, experimental methodology, theoretical proofs, results, and conclusions. Patents have similar well-formulated partitions.
Other resources are not so well-defined. Non-fiction articles and especially webcasts are usually convoluted: everything is mixed with everything else, which makes the writing easier for the author and the experience more engaging for the reader. Marketing copy is even more convoluted, designed to create purchasing impulses. Even scientific articles can be convoluted – unless they have passed peer review. Reviewers check the format of the presentation; they usually do not check the validity of claims unless the claims are preposterous.
When we want to analyze what we just read, it is best to color-code the content: facts and numbers (black on white), innovation (green), methodology (blue), motivation (red), and actionable items (yellow highlight). The colors come from the Six Thinking Hats paradigm.
Each part needs to be analyzed separately, with a very different set of tools.
Deduction, Induction, Abduction
A theory should be logically sound, but fully formal analysis of logical correctness is usually too strict for everyday texts. We start by checking that the assumptions are reasonable. Then there are several mathematical ideas that can be used in almost any scenario.
Deduction uses the language of formal logic, which is similar to the Boolean logic programmers know very well. The argument should be formally correct and cover all important scenarios. Since authors rarely use formal logic, their arguments need to be adapted into some pseudologic. Gödel's incompleteness theorem tells us a priori that no sufficiently rich formal system is complete. So the main question is: what exactly did we miss?
Induction uses the language of probabilities. There are probabilities of events and conditional probabilities of events given other events. Then, using the language of statistics, the likelihood of the conclusion is evaluated. It feels like playing poker, especially when we know that many authors are bluffing.
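Evaluating the likelihood of a conclusion given some evidence can be sketched as a simple Bayesian update. This is a minimal illustration, not a method from the article; all the numbers are invented:

```python
# Minimal Bayesian update: how likely is a claim given supporting evidence?
# All probabilities below are invented for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(claim | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

prior = 0.30  # initial belief in the author's claim
posterior = bayes_update(prior, 0.80, 0.20)
print(round(posterior, 3))  # belief after one piece of supporting evidence
```

A claim we believed 30% becomes roughly 63% likely after one piece of evidence that is four times more probable if the claim is true. Repeating the update with each new fact is the "poker" feeling described above: every card changes the odds.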
Abductive thinking is somewhat far-fetched: a qualitative analysis of some use cases, from which the author generalizes common ideas. Typically such analysis is backed up by more quantitative results to form something reasonably meaningful. Since quantitative and qualitative analyses usually address different attributes, we need to determine how closely these factors are related.
Occasionally theories are presented as self-explanatory and we are trusted to look online for supporting or contradicting information.
If you want to see all of these factors in one subject area, read about the health risks and benefits of coffee.
Flowcharts and logical markers
Some works contain built-in flowcharts. For example:
- Ontologies/taxonomies (hierarchy). These are tree structures like mindmaps.
- Feedback loops. These are classical flowcharts. Positive (vicious cycle) and negative (control) feedback loops are abundant.
- Contributing factors. These can be denoted either as mathematical sets or as pseudo-arithmetic.
- Diagrams. These are usually analyzed and memorized as-is; they are tools supporting intuition, not something mathematically valid.
All of these graphical tools are essentially logical markers. Add to this existence (∃) as a form of deduction, causality (=>) as a form of inference, and equivalence (~=), which is usually contextual. This provides a good range of logical markers for everyday use.
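The tree and loop structures above can be captured with very small data structures. Here is a minimal sketch; the node names and edges are invented for illustration:

```python
# Toy representation of the graphical tools above as plain data structures.
# All node and edge names are invented examples.

taxonomy = {  # ontology/taxonomy: a tree, like a mindmap
    "knowledge": ["facts", "theory", "methodology"],
    "theory": ["deduction", "induction", "abduction"],
}

feedback_loop = [  # positive feedback loop: a cycle of causal edges (=>)
    ("stress", "poor sleep"),
    ("poor sleep", "low focus"),
    ("low focus", "stress"),
]

def leaves(tree, root):
    """Collect the leaf concepts under a root of the taxonomy."""
    children = tree.get(root)
    if not children:
        return [root]
    out = []
    for child in children:
        out.extend(leaves(tree, child))
    return out

print(leaves(taxonomy, "knowledge"))
```

A dictionary of children is enough for a hierarchy, and a list of cause-effect pairs is enough for a feedback loop; both can later be drawn as mindmaps or flowcharts.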
“Experimental” setup contamination
In actual scientific research, scientists set up experiments to eliminate any factors that can contribute to the wrong conclusion. In real life, the “experimental” setup is seriously contaminated. Any result can be explained by multiple factors.
For example, historians rarely have enough evidence to run actual experiments, so they do the next best thing: they generate complex and far-fetched ideas based on very few actual findings, backing them up with plausible storytelling. While these stories are cool, they are not really scientific.
Business leaders are evaluated based on their contribution to the company’s performance. The performance of any company can be attributed to many factors. Simply increasing the risk level may produce both stellar success and huge failure. So the best and the worst leaders often use similar tactics.
Motivational books are typically biased, and all political literature is biased. Try to burst the filter bubble by considering the most plausible scenario. At the same time, notice that as we learn more, our plausibility estimates change considerably.
If you see something actionable, there is nothing better than actually doing it hands-on. Hands-on activity typically provides a lot of insight into factors and limitations not mentioned by the authors. Procedural learning is very different from declarative learning; it is a complementary activity, and it can be optimized using a different toolset. For example, visualizing the activity before performing it boosts performance.
We have a separate course on hacking procedural skills.
In the course, I provide some resources for basic mental math. You do not need to be accurate – just check that the numbers make sense. It is easy to put the decimal point in the wrong place, and it is just as easy to find such errors. You may also want to convert units and apply some other simple engineering tricks.
I do not encourage chasing many digits after the decimal point; you can use calculators for that. However, you should have basic numerical awareness to support your analysis. I have a friend who negotiates government contracts. He told me that he uses a lot of mental math to understand how a 0.5% addition to teachers' salaries may affect the entire budget structure. This is something everyone can learn to do with some practice.
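The budget example boils down to multiplying a few round numbers. Here is a sketch of the mental-math chain; all figures are invented and do not come from any real budget:

```python
# Mental-math sanity check: how a 0.5% raise to teachers' salaries
# propagates to a budget. All figures are invented for illustration.

budget_total = 100_000_000_000  # total budget, say 100 billion
salary_share = 0.10             # assume teachers' salaries are 10% of it
raise_pct = 0.005               # a 0.5% raise

extra_cost = budget_total * salary_share * raise_pct
print(extra_cost)                 # about 50 million
print(extra_cost / budget_total)  # 0.05% of the whole budget
```

Mentally: 10% of 100 billion is 10 billion; half a percent of that is 50 million. No decimal-point precision needed – only the order of magnitude matters for the negotiation.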
We can do the job of search engines: cross-referencing, looking for rare keywords, and comparing citations. All authors try to optimize for these factors; if something is not optimized, we are not likely to see it.
We may need to look for the initial mention or supporting numbers. For example, for years there was a claim that Israeli prime minister Benjamin Netanyahu had an IQ of 185. This claim echoed a single, not really substantiated article on the Business Insider blog. Business Insider removed the article and posted an apology, but it took time for search engines to purge the reference.
Searching with the keyword “controversy” is a cool way to find the opposite opinion. Then there is a question of who is more trustworthy.
Many scientists claim that speedreading is impossible, yet I can do it, and I know many others who can. Most of my students can speedread. I simply do not have the motivation to argue with psychologists: my PhD is in engineering.
Psychology is especially prone to pseudoscience, as many fathers of psychology used qualitative rather than quantitative approaches. As a rule of thumb, if a psychologist wants to talk about your mother, you should run. If you are asked about triggers and situations, stay. When you are taught hypnotic techniques, listen very carefully – even though they sound marginally crazy.
Motivation and intuition
Authors have good reasons to do what they do, and they typically disclose their research motivation. This is not the warm and reassuringly emotional "you can do it" kind, but a very cold "I needed that… so I added/removed…" or "I remember that…, so I thought I might use….".
Intuition is often backed up by diagrams, acronyms, and other mnemonic devices. It also quite often borrows from the TRIZ creativity toolset but uses a different language. You may also see a lot of stories and case studies as a basis for intuition. This is where qualitative analysis shines.
We can further apply research motivation and intuition from one situation to a very different situation. Formal creativity tools provide a very comfortable platform for this activity.
Applications and imagination
What to do with the new knowledge is not a part of the articles we usually read. One can use imagination and daydream about a possible future. Quite often from imagining the end results, we can interpolate the necessary milestones.
Pomodoro breaks are critical for productivity. Daydreaming is a serious element of Pomodoro breaks. If I get excited I definitely allow myself to daydream. Understanding the trends and imagining a possible future enable better decision-making.
The biggest milestone before integrating new knowledge into our toolset is simulation. Actual experiments and training might be hard and expensive; thought experiments and visualization cost nothing. If they were good enough for Einstein, they might also be sufficient for your needs.
Analysis by synthesis is an underrated research tool. To show something, try to build an example. A single example may invalidate long research and complex analysis.
If you want to know who is telling the truth or how to use logical markers, I seriously suggest taking the course.
It is a part of a larger bundle:
Research, analysis, and creativity masterclass:
A bundle with these courses
The time-limited 50% coupon for these three products: speedwritng_research_50
If you already paid for memory/speedreading and want to upgrade, please contact me at [email protected] for a special deal.