How to conduct ethnographic research: a practical guide with AI prompts
Ethnography is a qualitative research method where the researcher observes and interacts with users in their natural environment over an extended period — their homes, workplaces, commutes, or anywhere they actually encounter the problem you are studying. Unlike lab-based methods, ethnography captures behavior in context: the physical surroundings, social dynamics, tools, workarounds, and unspoken routines that users themselves may not think to mention in an interview.
What question does it answer?
- How do users actually behave in their real environment (as opposed to what they report in interviews)?
- What environmental, social, and cultural factors shape how users interact with products or services?
- What workarounds, improvised tools, and unspoken habits have users developed?
- Where do breakdowns happen in the user’s natural workflow that they have stopped noticing?
- What unmet needs exist that users cannot articulate because they have adapted to the current situation?
When to use
- At the very start of a project, when the team has little first-hand knowledge of the user’s world and needs to build genuine empathy before generating ideas.
- When interviews and surveys have stopped producing new insights because users cannot articulate what they do — they just do it.
- When you suspect a gap between what users report and what actually happens: stated behavior versus observed behavior.
- When designing for complex, multi-step workflows that involve physical spaces, multiple tools, or other people (healthcare, logistics, retail, manufacturing).
- When entering a new market or cultural context where assumptions from the home market may not hold.
- When prior research suggests environmental factors are influencing usability, but you do not yet know how.
Not the right method when you need quick feedback on a specific design decision (use usability testing), when the question can be answered with analytics or a survey, when you cannot access users in their real context, or when the timeline is too short for any field observation — even a condensed study needs at least 2-3 days in the field.
What you get (deliverables)
- Field notes: timestamped observations with rich contextual detail (what happened, who was involved, what tools were used, what the environment looked like)
- Photo and video documentation: images and clips that capture the environment, artifacts, and behaviors
- User journey maps grounded in observed (not self-reported) behavior
- Contextual personas: personas built from real field data rather than demographic assumptions
- Affinity diagram of observed themes, patterns, and anomalies
- Insight report: key findings structured as observation → implication → design opportunity
- Artifact inventory: photos and descriptions of tools, workarounds, notes, and physical objects users rely on
Participants and duration
- Participants: 5-12 users. In ethnography, depth matters more than breadth. A single well-observed participant can reveal more than fifty survey responses. If studying distinct user segments, aim for 3-5 per segment.
- Session length: Each field visit lasts 2-4 hours. Some studies use full-day shadowing (6-8 hours). Traditional ethnography spans weeks, but design ethnography — adapted for product teams — condenses observation into focused visits.
- Total duration: 2-6 weeks depending on scope. Planning and recruitment: 3-7 days. Field visits: 5-10 days (1-2 visits per day; visiting the same participant twice on different days is valuable). Analysis and synthesis: 3-5 days.
How to conduct ethnographic field studies
1. Define the research scope and questions
Write down what you need to learn and what decisions the research will inform. Ethnography can easily become unfocused if you enter the field without clear questions. Frame your scope as a domain to explore rather than a hypothesis to test. Example: “Understand how warehouse workers manage inventory throughout a shift” rather than “Prove that our app saves time.”
2. Identify and recruit participants
Find participants who represent the people you are designing for and who you can observe in their real environment. Screening should focus on behavioral criteria: frequency of the activity, role, and willingness to be observed at home or work. Recruit through customer databases, partner organizations, or field recruiting (approaching people at the location itself). Offer incentives that match the time commitment — $100-250 for a 2-4 hour visit, more for full-day shadowing.
3. Choose your observation approach
Field studies span a continuum from pure observation (the researcher watches silently) to participatory immersion (the researcher tries to do the task alongside the user). Decide where your study falls:
- Fly on the wall: Minimal interaction. You observe, photograph, and take notes without interrupting the user’s natural flow. Best when you want to capture unfiltered behavior.
- Guided tour: The user walks you through their environment and routine, narrating as they go. A balance between observation and interview.
- Apprenticeship model: You ask the user to teach you how to do their task. This surfaces tacit knowledge — the things people know but cannot explain without demonstrating.
- Shadowing: You follow the user through their day (or part of it), observing all activities, transitions, and interruptions. Best for understanding end-to-end workflows.
Most design ethnography studies use a blend of guided tour and shadowing, combined with short contextual interviews during natural breaks.
4. Prepare your field kit
Bring everything you need so you can focus on observing rather than troubleshooting logistics:
- Notebook and pen (always have an analog backup)
- Camera or smartphone for photos and short video clips
- Audio recorder or recording app for contextual interviews
- Printed consent forms
- A short topic guide with 5-8 open-ended questions for contextual interviews during breaks
- A small gift or incentive for participants
5. Conduct field visits
Arrive early. Spend the first few minutes building rapport — introduce yourself, explain what you will be doing, obtain consent, and reassure the participant that you are studying the situation, not judging them. Then step back and observe.
During the visit:
- Record observations in real time or immediately after each segment. Note what the user does, who they interact with, what tools they reach for, what they avoid, and what the environment looks like.
- Photograph artifacts: Post-it notes on monitors, printouts taped to walls, improvised tools, worn paths on floors — these physical traces reveal habits that users will not mention.
- Ask brief clarifying questions when there is a natural pause: “I noticed you switched to the paper form — what made you do that?” Keep questions grounded in what you just witnessed.
- Note your own emotional reactions and assumptions — these become useful during analysis as a check against bias.
6. Debrief after each visit
Within 30 minutes of leaving the field, write a structured debrief:
- Top 3 observations that surprised you
- Behaviors that contradicted your expectations
- Patterns you are starting to see across visits
- Specific moments worth revisiting in analysis
- New questions to explore in future visits
If working in a team, hold a short debrief conversation (15-20 minutes) to compare notes while the experience is fresh.
7. Organize and code your data
Collect all field notes, photos, video clips, and audio recordings in one place (Dovetail, Miro, or even a shared folder with a consistent naming convention). Then code the data:
- Tag observations by theme
- Mark moments of frustration, delight, confusion, or adaptation
- Identify artifacts and map them to the behaviors they support
8. Synthesize into insights
Build an affinity diagram: group coded observations into clusters, then name each cluster as a theme. Elevate themes into insights by adding implications:
“Warehouse workers check the paper clipboard before checking the app because the clipboard is mounted at eye level at the station entry, while the app requires unlocking and navigating two screens. The physical environment gives the paper tool an advantage the digital tool has not earned.”
Look for recurring patterns across participants, contradictions between what users said and what they did, environmental factors that enable or block desired behaviors, and workarounds that signal unmet needs.
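The cross-participant pattern check above can be made concrete with a small tally: a theme observed once is an anecdote, while a theme observed across several participants is a candidate pattern. The coded observations and the three-participant threshold below are illustrative only.

```python
from collections import Counter

# Illustrative (participant_id, theme) pairs produced during coding;
# the data here is made up for demonstration.
coded = [
    ("P01", "checks paper before app"), ("P02", "checks paper before app"),
    ("P03", "checks paper before app"), ("P01", "re-authentication delay"),
    ("P04", "re-authentication delay"), ("P02", "improvised label"),
]

# Count distinct participants per theme (dedupe repeat tags within a participant).
participants_per_theme = Counter()
for _, theme in set(coded):
    participants_per_theme[theme] += 1

# Themes seen in 3+ participants are worth elevating into insights.
patterns = [t for t, n in participants_per_theme.items() if n >= 3]
```

With this data, only "checks paper before app" crosses the threshold; the other themes stay on the watch list for future visits.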
9. Share findings with the team
Create a visual, evidence-rich presentation. Lead with insights and design opportunities, not a methods description. Include:
- Photos and short video clips (with consent) to bring the field to life for stakeholders who were not there
- Journey maps based on observed sequences
- “Before and after” framing (what we assumed versus what we found)
- Prioritized design opportunities tied to specific observations
How AI changes this method
AI compatibility: partial — AI assists with preparation (observation guides, recruitment materials) and analysis (field note coding, thematic synthesis, artifact cataloging) but cannot replace the researcher’s immersive presence in the field. Ethnography depends on prolonged observation, relationship-building with participants, and interpreting cultural and environmental context — activities that are irreducibly human.
What AI can do
- Field note processing at scale: Ethnographic studies generate large volumes of unstructured data — handwritten notes, audio clips, hundreds of photos. AI transcription tools convert audio to text instantly, and NLP-based tools (Dovetail, ATLAS.ti) can tag and cluster field notes by theme across multiple visits.
- Photo and artifact cataloging: Multimodal AI models can describe and tag photographs of workspaces, tools, and environments, creating a searchable visual database instead of manually labeling hundreds of field photos.
- Thematic analysis across visits: After coding field notes, an LLM can identify patterns across 5-12 participants that would take days of manual comparison — recurring workarounds, environmental constraints, and behavioral patterns.
- Observation guide and consent form drafting: AI generates customized observation checklists, contextual interview questions, and consent forms based on the study’s domain and goals, saving several hours of preparation.
- Debrief synthesis: After each field visit, the researcher writes brief bullet-point notes. AI can expand these into structured debriefs and compare them across visits, tracking how emerging themes evolve over time.
- Cross-cultural research support: Real-time translation tools help researchers working in unfamiliar linguistic contexts. AI can also flag culturally specific expressions in transcribed notes that require deeper interpretation.
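To make the debrief-synthesis use concrete, here is an example prompt. The wording, participant label, and section structure are illustrative; adapt them to your study and tooling.

```
You are helping a design researcher synthesize ethnographic field data.
Below are my raw bullet-point notes from today's field visit (participant
P04, visit 1). Expand them into a structured debrief with these sections:
(1) top 3 surprising observations, (2) behaviors that contradicted my
stated assumptions, (3) emerging patterns compared with my earlier
debriefs (pasted below), (4) open questions for the next visit.
Flag anything that is my interpretation rather than a direct observation.

[paste raw notes]
[paste earlier debriefs]
```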
What requires a human researcher
- Physical presence and immersion: Ethnography requires being in the environment — seeing, hearing, and feeling what the participant experiences. No AI can sit in a warehouse during a night shift or ride along on a delivery route.
- Building trust and rapport over time: Participants open up to researchers they trust. This trust develops through repeated visits, genuine interest, and human connection.
- Interpreting cultural and social dynamics: Why a team member defers to a colleague before acting, why a certain tool is placed in a specific location, why a routine changes on Fridays — these observations require cultural literacy and contextual judgment.
- Ethical navigation in the field: Deciding when to stop observing, when a participant is uncomfortable, when to intervene versus remain a passive observer — these are real-time ethical decisions that cannot be automated.
- Separating signal from noise in context: A field environment is full of stimuli. The researcher decides what matters — which behaviors to track, which artifacts to photograph, which moments to probe.
AI-enhanced workflow
AI’s most significant impact on ethnographic research is post-fieldwork. A researcher who spent a week conducting field visits used to spend another week organizing, coding, and synthesizing the data. With AI tools, the coding and initial synthesis can be compressed to 2-3 days. Transcription is automated, photo cataloging is assisted, and pattern detection across visits happens in minutes rather than days.
During fieldwork itself, AI plays a supporting role: recording audio, providing real-time transcription of contextual interviews, and generating end-of-day debrief templates. The core activity — observing, being present, noticing what matters — remains entirely human.
Where AI notably does not help is the interpretive leap that makes ethnography valuable: connecting a physical observation (“the clipboard is mounted at eye level”) to a design insight (“the physical environment gives analog tools an advantage the digital tool hasn’t earned”). This interpretive work — seeing the implications of what was observed — is what distinguishes ethnographic research from mere observation, and it remains a human skill.
Tools
- Documentation: Smartphone camera (always available), GoPro or action camera (hands-free recording during shadowing), portable audio recorder, field notebook
- Consent and logistics: DocuSign or printed consent forms, Calendly (scheduling), Google Maps (field visit planning)
- Analysis and synthesis: Dovetail (coding, tagging, repository), Miro (affinity diagrams, journey mapping), EnjoyHQ, Notably
- Transcription: Otter.ai, Rev (for contextual interview recordings)
- Photo and video management: Google Photos (auto-tagged, searchable), Lookback (for recorded sessions with annotations)
- AI-assisted analysis: Speak AI (theme detection from field notes), ATLAS.ti (qualitative coding with AI support), Dovetail AI (auto-tagging and clustering)
- Remote ethnography (when in-person is not possible): Dscout (mobile ethnography with participant self-documentation), Indeemo (video diary + tasks), EthOS (longitudinal participant engagement)
Works well with
- In-depth Interview: Ethnography shows what users do; follow-up interviews explain why. Running interviews after field visits lets you ask about specific observed behaviors rather than relying on users to recall them.
- Diary Study: A diary study captures behavior over days or weeks that a single field visit cannot cover. Combining the two gives you both researcher-observed depth and participant-reported breadth over time.
- Journey Mapping: Ethnographic observations provide real, witnessed touchpoints and transitions for journey maps — far more reliable than asking users to reconstruct their journey from memory.
- Persona Building: Field data makes personas evidence-based rather than assumption-based. Observed behaviors, environmental constraints, and workarounds become the foundation for persona attributes.
- Contextual Inquiry: Contextual inquiry is a more focused, shorter-form version of ethnographic observation centered on a specific task. Use ethnography first to understand the broad context, then contextual inquiry to drill into specific interactions within that context.
Example from practice
A hospital was redesigning its electronic health record (EHR) system. Surveys and interviews with nurses had identified “too many clicks” as the primary complaint. The product team planned to reduce the number of screens in the charting workflow.
Before committing to that direction, a researcher spent three days shadowing nurses across two units during day and night shifts. The field visits revealed something the surveys had missed entirely: nurses were not primarily frustrated by the number of clicks. They were frustrated because the EHR required them to sit at a fixed workstation, while their actual workflow was constantly mobile — moving between patient rooms, medication carts, and colleagues. Each time they returned to the workstation, they had to re-authenticate and navigate back to where they had left off. The “too many clicks” complaint was actually a symptom of context-switching caused by a mismatch between the system’s stationary design and the nurses’ mobile work pattern.
Based on these observations, the redesign shifted focus from screen reduction to mobile access and session persistence. The team introduced a tablet-based interface that nurses could carry between rooms, with automatic session resumption. Six months after deployment, charting time per patient decreased by 22%, and nurse satisfaction scores for the EHR rose from 3.1 to 4.2 on a 5-point scale.
Beginner mistakes
1. Entering the field with a solution already in mind
When a researcher arrives at the field site with a preconceived idea of what the problem is and what the product should look like, every observation gets filtered through that lens. Confirmatory evidence gets recorded while contradictory evidence gets dismissed as an outlier. The antidote is to write down your assumptions before the first visit and deliberately look for evidence that disproves them. If your field notes only confirm what you already believed, that is a warning sign, not validation.
2. Asking users what they want instead of observing what they do
Novice researchers default to interview mode — sitting down with the participant and asking questions. In ethnography, observation is the primary tool. What people do reveals needs they cannot articulate. If a warehouse worker says the scanning system works fine but you watch them shake the scanner three times before each successful scan, the observation is the data, not the verbal claim. Reserve questions for clarifying what you have already witnessed.
3. Spending too little time in the field
A single two-hour visit is not ethnography — it is a site tour. Patterns only emerge across multiple visits and multiple participants. Returning to the same participant on a different day often reveals that the first visit captured their “performance” behavior (showing you how things work) while subsequent visits capture their actual behavior (including shortcuts, frustrations, and workarounds they edited out initially).
4. Poor documentation that relies on memory
Field notes written hours later lose the specific details that make ethnographic data valuable. The exact sequence of actions, the physical position of tools, the tone of voice during a frustration moment — these details fade fast. Write notes during or immediately after each observation segment. Photograph everything. Use audio recording when contextual interviews happen spontaneously.
5. Failing to separate observations from interpretations
Raw field notes should distinguish between what you saw (“The nurse walked to the workstation, typed her password, waited 8 seconds for the screen to load, then navigated three menus to reach the patient chart”) and what you inferred (“The nurse was frustrated by the slow system”). Mixing the two makes analysis unreliable because your interpretations become indistinguishable from evidence. Use a two-column format: left column for observations, right column for your interpretations and questions.