Prompt Engineering
By INDOMITABLE LIMITED | October 14, 2025 (updated October 24, 2025)

Welcome to your Prompt Engineering quiz.

1. Which AI model made prompt engineering popular?
   - ChatGPT
   - Google BERT
   - TensorFlow
   - PyTorch

2. What is a “prompt”?
   - A programming function
   - A database
   - A command or instruction given to an AI model
   - A type of chatbot

3. Which of the following is NOT a type of prompt?
   - Zero-shot prompt
   - Compiler prompt
   - Image prompt
   - Instruction prompt

4. What is “zero-shot prompting”?
   - Prompting without examples
   - Prompting with one example
   - Training data prompts
   - Using multiple examples

5. What does “few-shot prompting” mean?
   - Giving random inputs
   - Providing large training data
   - Providing a few examples to guide the model
   - Providing one example

6. Which of these tools is commonly used for generating images from prompts?
   - GitHub
   - Notepad++
   - DALL·E
   - TensorFlow

7. Which of these is a best practice in prompt writing?
   - Use clear, specific instructions
   - Be vague and short
   - Write in code format only
   - Avoid examples

8. What is “Chain of Thought” prompting?
   - A series of reasoning steps given in a prompt
   - A type of chatbot conversation
   - Debugging technique
   - Neural network chain

9. What is the benefit of providing context in prompts?
   - Reduces response time
   - Helps the model understand intent and produce accurate output
   - Makes the model run faster
   - Increases token limit

10. What is a “system prompt” in ChatGPT?
    - A user’s direct question
    - A background instruction defining model behavior
    - Output format
    - Debugging log

11. What is tokenization in LLMs?
    - Deleting unnecessary data
    - Breaking text into smaller pieces for processing
    - Encrypting the data
    - Combining text blocks

12. Which OpenAI model is known for image understanding and generation?
    - Whisper
    - DALL·E
    - Codex
    - Embeddings

13. Which prompt gives better results?
    - Clear, detailed, and goal-oriented
    - Random keywords
    - Ambiguous and short
    - Code snippets only

14. What does “temperature” control in AI generation?
    - Creativity or randomness in responses
    - Speed of processing
    - Model memory
    - File size

15. Which of the following prompt styles improves factual accuracy?
    - Role-based prompting (“You are a data scientist…”)
    - Open-ended vague prompting
    - Random phrasing
    - Emotional prompting

16. What is the ideal way to refine a prompt?
    - Experiment and iterate based on model response
    - Change models
    - Use longer sentences only
    - Avoid testing

17. What is prompt chaining?
    - Using multiple related prompts to build complex tasks
    - Combining two LLMs
    - Creating multiple accounts
    - Debugging a model

18. What is a “context window” in LLMs?
    - The time the model runs
    - The number of tokens the model can remember in one interaction
    - The model’s dataset
    - The length of the output

19. What is a “negative prompt” (in image generation)?
    - A prompt with negative numbers
    - A prompt that asks the model to avoid certain features
    - A prompt that deletes images
    - A random prompt

20. Which of the following improves model reliability in prompt engineering?
    - Copying prompts from others
    - Using fewer words
    - Using random emojis
    - Using a role + task + context + format structure
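For illustration, here is a minimal sketch of the “role + task + context + format” prompt structure named in the question above. The scenario, variable names, and wording are hypothetical and not part of the quiz.

```python
# Minimal, hypothetical sketch of a "role + task + context + format" prompt.
# None of these strings come from the quiz; they only illustrate the structure.

role = "You are a data analyst who explains results to non-technical managers."
task = "Summarize the key findings of the attached A/B test report."
context = "The test compared two checkout flows over 30 days with 50,000 users."
output_format = "Answer with three bullet points followed by one recommendation."

# Joining the four parts gives a single structured prompt for an LLM.
prompt = "\n\n".join([role, task, context, output_format])
print(prompt)
```

Keeping the four parts separate makes the prompt easy to refine iteratively: one part can be changed while the rest stays fixed.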
21. What does “few-shot learning” enable LLMs to do?
    - Learn new tasks from few examples
    - Generate images only
    - Train on large datasets
    - Forget old data

22. Which of these helps prevent hallucinations in AI output?
    - Provide source references or constraints in the prompt
    - Use random data
    - Use higher temperature
    - Use shorter prompts

23. What is the main purpose of prompt engineering in generative AI?
    - To improve the quality and relevance of AI outputs
    - To run hardware optimization
    - To train the model
    - To build datasets

24. What does “multi-modal prompting” refer to?
    - Using multiple CPUs
    - Using text + image + audio inputs together
    - Running two models
    - Using multi-language prompts

25. What is the main advantage of iterative prompting?
    - Avoiding context
    - Refining outputs through step-by-step improvements
    - Using more tokens
    - Running the model faster
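As a worked illustration of the zero-shot and few-shot prompting ideas covered above, the sketch below builds both versions of the same classification prompt in plain Python. The reviews, labels, and wording are invented purely for the example.

```python
# Minimal, hypothetical sketch contrasting zero-shot and few-shot prompts.
# The reviews and labels are invented purely for illustration.

examples = [
    ("The package arrived two days late and the box was crushed.", "negative"),
    ("Setup took five minutes and everything worked immediately.", "positive"),
    ("The manual is confusing, but support resolved my issue quickly.", "mixed"),
]

new_review = "Great build quality, although the battery drains fast."

# Zero-shot: the instruction plus the new input, with no examples.
zero_shot_prompt = (
    "Classify the sentiment of this review as positive, negative, or mixed:\n"
    f"{new_review}"
)

# Few-shot: the same instruction plus a handful of labeled examples.
shots = "\n".join(f"Review: {text}\nSentiment: {label}" for text, label in examples)
few_shot_prompt = (
    "Classify the sentiment of each review as positive, negative, or mixed.\n\n"
    f"{shots}\n"
    f"Review: {new_review}\n"
    "Sentiment:"
)

print(few_shot_prompt)
```

The few-shot version tends to give more consistent results because the examples demonstrate both the task and the expected answer format.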