Who should take this exam?

Developers with at least three months of experience working with datasets and LookML, as well as familiarity with SQL and BI tools.

Recommended Looker knowledge:
  • Maintain and debug LookML code
  • Build user-friendly Explores
  • Design robust models
  • Define caching policies
  • Understand various datasets and associated schemas
Recommended Looker tools:
  • Looker IDE
  • Text editor
  • Looker’s SQL Runner
  • Content Validator
  • LookML Validator
  • Version control

Exam details:

  • Registration fee: $250 ($200 introductory pricing for 2019)
  • Number of questions: 65 multiple-choice or multiple-response questions*
  • Exam duration: 100 minutes
  • Score needed to pass: 750 (scale of 100-1000)
  • References: no hard-copy or online reference materials are allowed during the exam
  • Prerequisite: none required (course attendance is highly recommended; for more details on courses, see the recommended training and documentation section below)

* Each Looker exam also includes additional unscored questions that gather data for future use. They do not affect your score, and extra time is factored in to account for them.

Recommended training and documentation

  • Looker training and documentation
  • Get ready for development
  • Write LookML

What does the Looker LookML Developer Certification exam cover?

1.0 Model management — 39%
  1. Troubleshoot errors in existing data models. For example:
    • Determine error sources.
    • Apply procedural concepts to resolve errors.
  2. Apply procedural concepts to implement data security requirements. For example:
    • Implement permissions for users.
    • Decide which Looker features to use to implement data security (e.g., access filters, field-level access controls, row-level access controls); see the sketch at the end of this section.
  3. Analyze data models and business requirements to create LookML objects. For example:
    • Determine which views and tables to use.
    • Determine how to join views into Explores.
    • Build LookML objects to meet project-based needs (e.g., data sources, replication, mock reports provided by clients).
  4. Maintain the health of LookML projects in a given scenario. For example:
    • Ensure existing content is working (e.g., use Content Validator, audit, search for errors).
    • Resolve errors.
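  For the data security and join objectives above, a minimal sketch of a model file might look like the following; the table, field, and user-attribute names (orders, users, region, department) are hypothetical and used only for illustration:

    # Field-level security: only users whose "department" attribute is allowed
    # can see fields that require this grant.
    access_grant: can_view_pii {
      user_attribute: department
      allowed_values: ["support", "compliance"]
    }

    explore: orders {
      # Row-level security: each user sees only rows matching their "region" user attribute.
      access_filter: {
        field: orders.region
        user_attribute: region
      }

      # Join a view into the Explore with an explicit relationship so aggregates stay correct.
      join: users {
        type: left_outer
        sql_on: ${orders.user_id} = ${users.id} ;;
        relationship: many_to_one
      }
    }

    view: orders {
      sql_table_name: public.orders ;;
      dimension: id {
        primary_key: yes
        type: number
        sql: ${TABLE}.id ;;
      }
      dimension: user_id {
        type: number
        sql: ${TABLE}.user_id ;;
      }
      dimension: region {
        type: string
        sql: ${TABLE}.region ;;
      }
    }

    view: users {
      sql_table_name: public.users ;;
      dimension: id {
        primary_key: yes
        type: number
        sql: ${TABLE}.id ;;
      }
      # Field-level control: only users granted can_view_pii see this dimension.
      dimension: email {
        type: string
        sql: ${TABLE}.email ;;
        required_access_grants: [can_view_pii]
      }
    }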
2.0 Customization — 30%
  1. Design new LookML dimensions or measures with given requirements. For example:
    • Translate business requirements (specific metrics) into the appropriate LookML structures (e.g., dimensions, measures, and derived tables).
    • Modify existing project structure to account for new reporting needs.
    • Construct SQL statements to use with new dimensions and measures.
  2. Build Explores for users to answer business questions. For example:
    • Analyze business requirements and determine LookML code implementation to meet requirements (e.g., models, views, join structures).
    • Determine which additional features to use to refine data (e.g., sql_always_where, always_filter, showing only certain fields via hidden: or fields:).
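  As a sketch of the customization objectives above (the order_items table and its columns are hypothetical), a view might translate business metrics into dimensions and measures, and its Explore might refine what users can see and filter:

    view: order_items {
      sql_table_name: public.order_items ;;

      dimension: id {
        primary_key: yes
        type: number
        sql: ${TABLE}.id ;;
      }
      dimension_group: created {
        type: time
        timeframes: [date, week, month, year]
        sql: ${TABLE}.created_at ;;
      }
      dimension: status {
        type: string
        sql: ${TABLE}.status ;;
      }
      # Hidden from users in the field picker, but still referenceable in LookML.
      dimension: margin_pct {
        hidden: yes
        type: number
        sql: ${TABLE}.margin_pct ;;
      }

      # Business metrics translated into measures.
      measure: total_revenue {
        type: sum
        sql: ${TABLE}.sale_price ;;
        value_format_name: usd
      }
      measure: order_count {
        type: count_distinct
        sql: ${TABLE}.order_id ;;
      }
    }

    explore: order_items {
      # Always exclude test data; users cannot remove this condition.
      sql_always_where: ${order_items.status} != 'test' ;;

      # Require a date filter; users can change the value but must keep a filter.
      always_filter: {
        filters: [order_items.created_date: "last 90 days"]
      }

      # Expose only a curated set of fields to keep the Explore user-friendly.
      fields: [
        order_items.created_date,
        order_items.status,
        order_items.total_revenue,
        order_items.order_count
      ]
    }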
3.0 Optimization — 18%
  1. Apply procedural concepts to optimize queries and reports for performance. For example:
    • Determine which solution to use based on performance implications (e.g., Explores, merged results, derived tables).
    • Apply procedural concepts to evaluate the performance of queries and reports.
    • Determine which methodology to use based on the sources of query and report performance issues (e.g., A/B testing, SQL principles).
  2. Apply procedural concepts to implement persistent derived tables and caching policies based on requirements. For example:
    • Determine appropriate caching settings based on the data warehouse's update frequency (e.g., hourly, weekly, based on ETL completion).
    • Determine when to use persistent derived tables based on runtime and complexity of Explore queries, and on users’ needs.
    • Determine appropriate solutions for improving data availability (e.g., caching query data, persisting tables, combination solutions).
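  For the caching and persistent derived table objectives above, a minimal sketch might tie a PDT and an Explore to a datagroup triggered by ETL completion; the trigger query and table names (etl_jobs, orders) are assumed for illustration:

    # Rebuild caches and PDTs when the nightly ETL finishes, or at most every 24 hours.
    datagroup: nightly_etl {
      sql_trigger: SELECT MAX(completed_at) FROM etl_jobs ;;
      max_cache_age: "24 hours"
    }

    # A derived table persisted as a PDT and rebuilt when the datagroup triggers.
    view: customer_order_facts {
      derived_table: {
        sql:
          SELECT
            user_id,
            COUNT(*)        AS lifetime_orders,
            MAX(created_at) AS latest_order_at
          FROM orders
          GROUP BY 1 ;;
        datagroup_trigger: nightly_etl
      }

      dimension: user_id {
        primary_key: yes
        type: number
        sql: ${TABLE}.user_id ;;
      }
      dimension: lifetime_orders {
        type: number
        sql: ${TABLE}.lifetime_orders ;;
      }
      dimension_group: latest_order {
        type: time
        timeframes: [date, week, month]
        sql: ${TABLE}.latest_order_at ;;
      }
    }

    # Cache query results for this Explore using the same policy.
    explore: customer_order_facts {
      persist_with: nightly_etl
    }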
4.0 Quality — 13%
  1. Implement version control based on given requirements. For example:
    • Determine the appropriate setup for Git branches (e.g., shared branches, pull from remote production).
    • Reconcile merge conflicts with other developer branches (e.g., manage multiple users).
    • Validate the pull request process.
  2. Assess code quality. For example:
    • Resolve validation errors and warnings.
    • Utilize features to increase usability (e.g., descriptions, labels, group labels); see the sketch at the end of this section.
    • Follow appropriate coding conventions for project files (e.g., one view per file).
  3. Utilize SQL Runner for data validation in a given scenario. For example:
    • Determine why specific queries return the results they do by examining the generated SQL in SQL Runner.
    • Resolve inconsistencies found in the system or analysis (e.g., different results than expected, non-unique primary keys).
    • Optimize SQL queries for cost or efficiency based on business requirements.
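  As a sketch of the code quality objectives in this section, usability features such as labels, group labels, and descriptions might be applied in a single-view file like the one below; the users view and its columns are hypothetical:

    # Hypothetical file users.view.lkml, kept to one view per file.
    view: users {
      sql_table_name: public.users ;;

      dimension: id {
        primary_key: yes
        type: number
        sql: ${TABLE}.id ;;
        hidden: yes   # surrogate key; not useful to business users
      }

      dimension: first_name {
        group_label: "Name"
        label: "First Name"
        description: "Given name as captured at signup."
        type: string
        sql: ${TABLE}.first_name ;;
      }

      dimension: last_name {
        group_label: "Name"
        label: "Last Name"
        description: "Family name as captured at signup."
        type: string
        sql: ${TABLE}.last_name ;;
      }

      measure: count {
        label: "Number of Users"
        description: "Total number of user records."
        type: count
      }
    }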