⚡ I analysed notes from 50 research calls using ChatGPT and Notion AI. This is what I learned about AI-supported research.
Since spring 2022, we at Reruption have been conducting calls with experts from corporate innovation functions. The goal was to create a whitepaper on “Challenges in Corporate Innovation in 2023”.
In the meantime, we have launched the results as a “dynamic whitepaper” on our website (https://lnkd.in/e6mNxHaK).
To turn the qualitative input from our expert conversation partners into quantitative results, I used ChatGPT and Notion AI. That made the analysis more than 50x faster (no bs!!), but it’s not plug and play:
𝟭. 𝗬𝗼𝘂 𝗵𝗮𝘃𝗲 𝘁𝗼 𝗸𝗻𝗼𝘄 𝘁𝗵𝗲 𝗺𝗲𝘁𝗵𝗼𝗱: Don’t know what deductive reasoning is? AI won’t be helpful.
𝟮. 𝗬𝗼𝘂 𝗵𝗮𝘃𝗲 𝘁𝗼 𝗱𝗼 𝘁𝗵𝗲 𝗰𝗼𝗻𝗰𝗲𝗽𝘁𝘂𝗮𝗹 𝘄𝗼𝗿𝗸.
𝟯. 𝗧𝗵𝗲𝗿𝗲 𝗶𝘀 𝗻𝗼 𝘀𝗵𝗼𝗿𝘁𝗰𝘂𝘁: There is no “analyse this pile of data and put it into form xyz”. You have to break the work down into single steps. AI can’t (yet) do multistep tasks by itself.
𝟰. 𝗔𝗺𝗼𝘂𝗻𝘁 𝗼𝗳 𝗱𝗮𝘁𝗮 𝗶𝘀 (𝘀𝘁𝗶𝗹𝗹) 𝗮 𝗽𝗿𝗼𝗯𝗹𝗲𝗺: ChatGPT can’t yet handle long notes as input. You therefore have to break the notes into multiple parts before having them analysed.
𝟱. 𝗟𝗮𝗻𝗴𝘂𝗮𝗴𝗲𝘀 𝘄𝗼𝗻’𝘁 𝗺𝗮𝘁𝘁𝗲𝗿 𝗶𝗻 𝘁𝗵𝗲 𝗳𝘂𝘁𝘂𝗿𝗲: input notes in German, prompts in English, output in English. The AI doesn’t care. So in the future, individual languages will become less important: you work, think and type in whatever suits you best, and the AI handles the rest. The same will become true for programming languages: you describe what you want in natural language, and the computer handles the rest.
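To make points 3 and 4 concrete, here is a minimal sketch of the kind of pre-processing step I mean: splitting long call notes into smaller chunks before pasting them into the chat, one prompt per chunk. The 2,000-character limit and paragraph-based splitting are my own assumptions for illustration, not a documented ChatGPT constraint or the exact script we used.

```python
def chunk_notes(text: str, max_chars: int = 2000) -> list[str]:
    """Split notes into chunks of at most max_chars, breaking on paragraphs.

    Assumes paragraphs are separated by blank lines; a single paragraph
    longer than max_chars is kept whole rather than cut mid-sentence.
    """
    paragraphs = text.split("\n\n")
    chunks: list[str] = []
    current = ""
    for para in paragraphs:
        # Start a new chunk if adding this paragraph would exceed the limit.
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks
```

Each chunk then becomes one prompt (e.g. “extract the main challenges mentioned in these notes”), and the per-chunk answers are merged afterwards, either by hand or with a final summarising prompt. That is the “break the work into single steps” idea from point 3 in code form.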
What are your AI learnings so far?