Chat Your Data: Prompt Engineering for Standardized GenAI Results
Christian Micus, Ani Dekova, Timo Böttcher, Helmut Krcmar
As generative AI (GenAI) adoption accelerates, industries face the challenge of integrating these technologies into their workflows. This study addresses the gap in prompt engineering frameworks needed to enhance LLM accuracy in industrial applications. We investigate a multifaceted approach incorporating contextual cues, directive language, and structured outputs, drawing on human communication paradigms. Our study introduces prompt tuning and chain-of-thought prompting to reduce user input and clarify AI decision-making. Our findings indicate that these methods can streamline complex processes, such as employee onboarding, paving the way for wider GenAI adoption. The study advocates customizing GenAI for specific industrial tasks and calls for increased model transparency to bridge data gaps and mitigate "black box" issues. The contribution demonstrates prompt engineering's role in improving operational efficiency and precision, promoting GenAI's broader implementation in business settings.
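
To illustrate the kind of prompt structure the abstract describes, the sketch below layers contextual cues, directive language, a chain-of-thought instruction, and a structured output format into a single prompt for a hypothetical employee-onboarding question. It is a minimal sketch under stated assumptions, not the authors' actual framework; the company role, policy text, field names, and JSON schema are illustrative placeholders.

# Minimal illustrative sketch (not the paper's framework): combining contextual
# cues, directive language, chain-of-thought prompting, and a structured output
# schema for a hypothetical onboarding question.

def build_prompt(context: str, question: str) -> str:
    """Assemble a prompt from the components discussed in the abstract."""
    # Contextual cue: tell the model its role and what data it may rely on.
    role = "You are an HR onboarding assistant for a manufacturing company."
    grounding = f"Context documents:\n{context}"
    # Directive language: constrain the answer to the provided context.
    directive = "Answer the new hire's question using ONLY the context above."
    # Chain-of-thought: ask for intermediate reasoning before the final answer.
    reasoning = "Think step by step before giving your final answer."
    # Structured output: request JSON so downstream systems can parse the result.
    schema = (
        "Respond as JSON with exactly these keys: "
        '{"reasoning": "...", "answer": "...", "confidence": "low|medium|high"}'
    )
    return "\n\n".join(
        [role, grounding, directive, reasoning, schema, f"Question: {question}"]
    )

if __name__ == "__main__":
    context = "New employees must complete IT security training within 14 days of their start date."
    question = "When do I need to finish the security training?"
    print(build_prompt(context, question))
    # The model's reply would then be parsed (e.g., with json.loads) and checked
    # for the expected keys before being shown to the user.

In this sketch, the structured JSON schema is what makes the GenAI output "standardized" in the sense of the title: downstream systems can validate and consume the response mechanically rather than parsing free text.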
