Deploying Spotter

To fine-tune Spotter’s understanding of your data, you will want to coach it to recognize your organization’s business terms and common use cases before users begin searching. This article describes the recommended order of steps for preparing Spotter for use by your organization.

Step 1: Identify who you’re coaching Spotter for

Start by thinking about your target personas (the types of users) and the kinds of questions they typically ask. Spotter is most effective when its coaching is based on real user needs.

Table 1. Example persona table

Team             | Role                       | Use cases
Sales            | Sales Ops, Regional heads  | Pipeline metrics, conversion, regional performance
Marketing        | Campaign managers          | Spend ROI, campaign attribution, top-performing channels
Customer success | CSMs                       | Churn risk, product usage patterns, account health scores

Focus on one team or user group at a time, preferably one that has urgent data needs and could see high value from using Spotter.

Step 2: Collect real questions and group them

Gather the actual questions your target business users are asking. This step is crucial because analysts, who are typically more familiar with the dataset’s structure and terminology, may ask questions very differently than business users.

Collecting real questions directly from business users helps you understand their natural language, the specific terms they use, and the context behind their queries. This collection will form your Spotter Coaching Scope. Discovering early on how your business users actually phrase their questions helps you gauge how much coaching Spotter will require and exactly where to focus your coaching efforts.

Table 2. Example (Sales persona)

Group             | Sample questions
Pipeline metrics  | “How much pipeline do we have for this quarter?”
Conversion funnel | “How many deals were won last month?”
Team comparisons  | “Which sales rep has the highest win rate?”

If you’re unsure how to begin categorizing these questions or want a structured way to organize them for different use cases, you can use this Spotter coaching scope template to get started.
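
If it helps to keep the scope in a structured, machine-readable form alongside the template, the minimal sketch below shows one way to capture it. The structure (persona, group, questions) is illustrative only and not a required ThoughtSpot format.

    # Illustrative only: one way to capture a Spotter coaching scope as data.
    # The field names (persona, group, questions) are assumptions, not a
    # required ThoughtSpot format.
    coaching_scope = [
        {
            "persona": "Sales",
            "group": "Pipeline metrics",
            "questions": ["How much pipeline do we have for this quarter?"],
        },
        {
            "persona": "Sales",
            "group": "Conversion funnel",
            "questions": ["How many deals were won last month?"],
        },
        {
            "persona": "Sales",
            "group": "Team comparisons",
            "questions": ["Which sales rep has the highest win rate?"],
        },
    ]

    # Summarize how many collected questions fall into each group.
    for entry in coaching_scope:
        print(f"{entry['persona']} / {entry['group']}: {len(entry['questions'])} question(s)")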

Step 3: Optimize data model

Once you have a clear understanding of your target users and the real questions they ask, the next crucial step is to ensure your data model is optimized to answer these questions effectively. This might involve creating a new data model specifically for this use case, or optimizing an existing one for which you have already created Liveboards and Answers.

Associated content, such as Liveboards built on a Model, helps “warm up” Spotter: it improves column association, so Spotter picks columns more accurately for topics on which content has already been created. For more information, see Spotter Model readiness.
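
As a planning aid only, you might track the column renames and synonyms you intend to make in a simple structure like the sketch below. The changes themselves are applied in the Model through ThoughtSpot, not through code, and the column names shown are hypothetical.

    # Illustrative planning sketch only: the raw column names and friendly names
    # below are hypothetical examples, and the changes are made in the Model.
    column_plan = {
        "opp_amt_usd": {"display_name": "Pipeline Amount", "synonyms": ["pipeline", "open pipeline"]},
        "close_dt": {"display_name": "Close Date", "synonyms": ["closed on"]},
        "owner_name": {"display_name": "Sales Rep", "synonyms": ["rep", "account executive"]},
    }

    for raw_name, plan in column_plan.items():
        print(f"{raw_name} -> {plan['display_name']} (synonyms: {', '.join(plan['synonyms'])})")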

Step 4: Test Spotter

After you’ve created or optimized your data model, perform an initial round of testing before diving deep into creating specific Spotter coaching like reference questions.

The purpose of this testing phase is to take a baseline assessment of how well Spotter can answer your collected questions with just the optimized data model, and to identify gaps or questions that Spotter struggles with or gets wrong. This helps determine whether the issue lies with the data model itself (for example, you might need to tweak column names and synonyms as part of optimizing the data model), or whether you need to add explicit Spotter coaching.

To test how Spotter responds to your Model, take a representative sample of the real questions you collected, and ask these questions in Spotter against the data model. Review the answers generated by Spotter, and note which ones are accurate, which are close, and which are entirely off. This review informs whether you need to revisit data model optimization or if you’re ready to proceed to targeted Spotter coaching for the problematic queries.
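
A lightweight way to record this baseline is sketched below. The structure and ratings are illustrative; the review itself happens manually in Spotter.

    # Illustrative only: record each collected question and a manual rating of
    # Spotter's answer ("accurate", "close", or "off"), then summarize the gaps.
    from collections import Counter

    baseline_results = [
        ("How much pipeline do we have for this quarter?", "accurate"),
        ("How many deals were won last month?", "close"),
        ("Which sales rep has the highest win rate?", "off"),
    ]

    print(Counter(rating for _, rating in baseline_results))

    # Questions that were not fully accurate are candidates for data model tweaks
    # (column names, synonyms) or explicit Spotter coaching.
    gaps = [question for question, rating in baseline_results if rating != "accurate"]
    print(gaps)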

Step 5: Add coaching

Reference questions

After optimizing your data model and performing initial tests, reference questions become a primary tool for direct Spotter coaching.

Reference questions teach Spotter how to accurately interpret specific common questions, ensuring business users get verified and consistent answers. They are essentially sample questions with their corresponding correct answers in ThoughtSpot search keyword language.

Reference questions are curated examples where you define a commonly asked natural language question (such as “what are sales for east this month?”) and pair it with a verified query response using correct filters, fields, and formatting.

When business users ask the same or similar questions, Spotter uses these reference examples to generate the right response.
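
For illustration only, a reference question pairs the two parts like this. The keyword query shown is an assumed example for a simple sales Model; actual reference questions are created through Spotter coaching, not in code.

    # Illustrative only: a natural language question paired with its verified
    # answer in ThoughtSpot search keyword language. The column names and the
    # keyword query are assumptions for an example sales Model.
    reference_question = {
        "question": "what are sales for east this month?",
        "verified_query": "sales region = 'east' this month",
    }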

For more information on the process of coaching, see Spotter reference questions and Context in Spotter.

Business terms

Business terms are definitions you create to map your company’s everyday language (vocabulary used in day-to-day operations) to precise data logic (filters or formulas). This helps Spotter understand and accurately respond to phrases that might be vague, ambiguous, or specific to your business, which general LLMs wouldn’t grasp correctly. Every business has its own definitions for various metrics and specific ways of calculating them.

Business terms are powerful for:

  • Defining metric calculations: Teach Spotter exactly how your business calculates specific metrics (for example, 'Adjusted Gross Profit' = Gross Profit - Marketing Spend). Spotter can learn how you calculate these and extrapolate to different scenarios.

  • Creating synonyms and acronyms: Map common internal shorthand or alternative names to specific values or columns (for example, ‘N.Am’ as country = ‘North America’).

  • Applying specific filters: Ensure certain terms always include necessary filters (for example, ‘active users’ always filters out users who have not logged in yet).
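
As a summary of the examples above, the sketch below captures the kinds of mappings business terms express. The actual definitions are created through Spotter coaching, not in code.

    # Illustrative only: each business term maps everyday language to data logic.
    business_terms = {
        # Metric calculation
        "Adjusted Gross Profit": "Gross Profit - Marketing Spend",
        # Synonym or acronym mapped to a column value
        "N.Am": "country = 'North America'",
        # Term that always implies a filter
        "active users": "exclude users who have not logged in yet",
    }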

For more information on the process of coaching, see Spotter business terms.

Step 6: Retest, validate, and save

Once your reference questions and business terms are in place, validate them with comprehensive test queries, test with an internal pilot group for feedback, and then refine and finalize your coaching.

Validate with comprehensive test queries

  1. Run various sample questions (a set of test queries defined during step 2 and step 4) to check if Spotter applies the logic correctly.

    [Screenshot: a sample question in the Spotter search bar reading “show me the win rate in the last quarter for David”]
  2. Check that the answer is generated as you expect. For instance, if you coached “win rate” and ask for “win rate this quarter”, check whether Spotter correctly applies the formula and adapts to “this quarter”. (A sketch for generating such time-period variations follows this procedure.)

    [Screenshot: the “Review AI-generated Answer” screen]

    Here, you can see that the three formulas defined for win rate are automatically generated and updated to use “this quarter” instead of last quarter, with the correct date column for each.

    You can now be more confident that Spotter applies this logic anywhere “win rate” is used, even in follow-up questions or comparisons.

  3. Update and fine-tune your coaching based on these tests. If you are testing terms internally and don’t want to affect other users’ queries, keep the coaching at the user level. If you are ready for it to apply to all queries on the data model, set it to the global level.

    [Screenshot: the Change business term access modal]

For more information on user and group level coaching, see Understand coaching levels.
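
The minimal sketch below (referenced from step 2 of the procedure above) shows one way to generate time-period variations of a coached question, so you can confirm in Spotter that the logic adapts instead of matching only the exact phrasing you coached. The question and periods are examples.

    # Illustrative only: generate variations of a coached question across time
    # periods, then ask each in Spotter and confirm the "win rate" formulas and
    # date filters adapt correctly.
    periods = ["this quarter", "last quarter", "last month"]
    test_queries = [f"show me the win rate {period} for David" for period in periods]
    for query in test_queries:
        print(query)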

Test with an internal pilot group for feedback

  1. Engage a select group of internal users (ideally some of the target personas identified earlier or power users familiar with the data) to conduct pilot testing. This group is focused on validating the coaching and Spotter’s performance before it’s finalized.

  2. Provide them with access to Spotter with the new coaching.

  3. Encourage them to ask their typical business questions.

  4. Review the types of questions they ask from the conversation log in the Spotter Conversations Liveboard.

  5. Gather the users’ feedback on the accuracy and relevance of answers, ease of getting the information they need, and any confusing responses or unexpected behavior.

This pilot feedback is invaluable for uncovering blind spots or areas where the coaching might need refinement from a user perspective.

If your selected users do not have Model editing access, their feedback remains at the user level, affecting only their own queries.

Refine and finalize coaching

Based on the results from your test queries and the feedback from the internal pilot group, iterate on your coaching.

This may involve:

  • Adjusting reference questions.

  • Modifying business term definitions.

  • Adding new coaching for unanticipated scenarios.

  • Potentially revisiting the data model if fundamental issues are identified.

Continue this test-and-refine cycle until you achieve a satisfactory level of accuracy and usability.

Step 7: Prepare for launch

Once your coaching has been thoroughly tested and validated, the next crucial phase is to prepare for a smooth and effective rollout of the coached Spotter experience to your broader end-user community. Proactive planning for user adoption, coaching, and support is key to a successful launch.

Leverage your pilot group as champions

Leverage the internal users from your pilot testing to act as champions for Spotter. Encourage them to share their positive experiences and assist colleagues, as their advocacy can significantly boost initial adoption.

Develop and share user resources

Create concise user resources like a “Spotter FAQ” or a quick-start guide. Include a list of well-coached example questions relevant to their use cases to help users achieve early success with Spotter.

Coach business users effectively

Conduct focused coaching sessions that teach users how to interact with Spotter for their specific business questions and workflows. Crucially, emphasize the importance of verifying the answers Spotter provides, and encourage users to assess them critically. Show them how to provide feedback (using the available rating tools and noting issues for the coaching team) or how to rephrase their question if an answer seems incorrect. This empowers them to contribute to Spotter’s refinement.

Establish clear support channels

Clearly define and communicate how end users can get help with Spotter, whether for questions, technical issues, or improvement suggestions.

Communicate the launch effectively

Formally announce Spotter’s availability to the target user groups. Highlight the benefits for their roles, explain how to access Spotter, and point them to coaching resources and support channels. Consider a phased rollout, starting with enthusiastic or high-impact teams.

Completing these preparation steps will significantly enhance the adoption, perceived value, and overall effectiveness of Spotter within your organization, setting the stage for a successful deployment.

Step 8: Launch, monitor, and iterate

Once you’re confident, launch the use case with your end users. Coaching doesn’t stop at launch; ongoing monitoring is key to improving accuracy and adoption.

Roll out Spotter according to your communication plan and monitor how Spotter is being used and how it’s performing. Use the Spotter Conversations Liveboard to track key metrics such as:

  • Most common questions asked by users.

  • Perceived accuracy of responses (for example, through upvoted and downvoted answers or other feedback mechanisms).

  • Coverage gaps in your current coaching (that is, questions Spotter struggles with or answers incorrectly).

  • User adoption rates across different teams or personas.

  • Common terms or phrasing used by users that may not yet be part of your coaching.

The insights gained from monitoring, coupled with changes in your business environment, will guide your ongoing efforts to enhance Spotter.

Usage-driven refinements (proactive and iterative)

Identify coaching opportunities by regularly analyzing the data and feedback gathered from monitoring.

Pay close attention to “downvoted” conversations, queries where Spotter provided no answer, or instances where users rephrased their questions multiple times. These are excellent candidates for new reference questions, business term definitions, or refinements to existing coaching.
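
The sketch below illustrates this kind of triage. It assumes you have exported conversation records (for example, from the Spotter Conversations Liveboard) with the hypothetical fields shown; this is an assumption about the export, not a documented schema.

    # Illustrative only: triage exported conversation records to surface coaching
    # candidates. The field names (question, feedback, answered) are assumptions,
    # not a documented ThoughtSpot schema.
    conversations = [
        {"question": "win rate by rep this quarter", "feedback": "downvote", "answered": True},
        {"question": "pipeline coverage for EMEA", "feedback": None, "answered": False},
        {"question": "deals won last month", "feedback": "upvote", "answered": True},
    ]

    coaching_candidates = [
        record["question"]
        for record in conversations
        if record["feedback"] == "downvote" or not record["answered"]
    ]
    # Good candidates for new reference questions, business terms, or refinements.
    print(coaching_candidates)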

Event-driven updates (reactive and planned)

Beyond continuous usage-based refinement, plan to revisit and update your Spotter coaching in response to specific business or data changes. Key triggers include:

  • New business metrics, KPIs, or important concepts being introduced in your organization.

  • Changes in the definitions or calculation logic of existing metrics or business terms.

  • Significant modifications to your data model, or when new data sources, tables, or critical columns are added that users will want to query via Spotter.

  • Direct user feedback (outside of automated monitoring) that highlights consistent misunderstandings or areas for improvement.

  • Evolution in your overarching business strategy or reporting requirements, necessitating changes in how data is interpreted or presented.

By establishing a cycle of launching, actively monitoring, and iteratively refining your coaching based on both real-world usage and business evolution, you ensure that Spotter remains an accurate, relevant, and increasingly valuable tool for your organization.

Step 9: Troubleshoot common coaching scenarios

Even with careful data modeling and dedicated coaching, you might occasionally encounter scenarios where Spotter’s responses aren’t what you expect, or you might have questions about the best coaching approach. This section covers common issues and provides guidance.

Table 3. Quick troubleshooting: Common issues and initial actions

Issue: Spotter answers incorrectly
What to do: Check the data model and coaching logic. Is the underlying data model (joins, column names, data types) accurate for the question? Verify the logic in your reference questions and business terms, and check whether the user’s phrasing is very different from your coaching. Also consider ambiguity: if the user’s original question is ambiguous, rephrasing the question or adding a clarifying business term can help.

Issue: A business term isn’t being applied
What to do: Confirm that the exact phrase used in the question matches the defined business term (for example, “win rate” is not the same as “win_ratio”).

Issue: Unable to apply a default filter or data granularity
What to do: This is currently not supported. Consider using specific reference questions for queries that users might ask, or explore data modeling options.

Issue: Conflicting results with slight variations of a question
What to do: You might have duplicate or conflicting coaching examples. Try to remove or clarify them. If you are stuck, contact ThoughtSpot Support.

For more examples on how to refine your coaching, see How many examples and when?


