Steps for Getting Started with Data-Driven Marketing

No matter your business size or industry, doing more of what’s working and less of what’s not is a no-brainer. With data-driven marketing, you can not only uncover insights on your prospect and customer behaviors, but you can further leverage those insights to inform the strategies that impact the success of your business.

Check out these steps for getting started with data-driven marketing at your organization:

Step 1 - Define Your Goals

Setting your team up for success starts with defining the goals and outcomes you want to achieve. Whether you’re looking to increase the number of new customers acquired, increase website visits, or grow marketing-generated revenue, setting specific and measurable goals will help your team map out the strategies that will lead to a successful outcome.

When developing a plan for the entire team, build your strategy around marketing channels in which actions can be attributed back to the goal. Additionally, consider the available budget when mapping out the allocation of resources, so spending guides effective goal-related actions rather than inefficient activities. For example, a social media campaign creates tangible, trackable data points, while a billboard campaign offers only theoretical ones.

Step 2 - Identify The Methods of Tracking

Once the goals have been set, use them to outline the questions you need answers to. Questions like:

  • What should we measure?
  • Do we have that data?
  • If yes — how do we go about accessing it?
  • How do we turn that data into insights we can act on?

These answers will help you determine the data you need to measure, and how to track goal performance over the period of time in which you aim to achieve it.

Step 3 - Analysis, Attribution, and A/B Testing

With goals, metrics and tracking methods set, campaigns contributing to the overall goals can commence. As data gets collected from these various campaigns, your team(s) can begin to analyze the outcomes of their strategies in action.

During this time, the data may reveal an underperforming campaign that is not meeting the predetermined benchmark of success metrics. While removing the dead weight and continuing forward without this activity may seem like the most efficient, cost-effective option, allowing for time to assess the information may uncover variations to the campaign that will save you from needing to remove it at all.

Attribution Assesses What’s Driving Results

Understanding which efforts are actually contributing to the overall goals is an important factor for any successful analytics-driven marketing initiative. To be sure that the methods you use to assign attribution align with the goals of the organization as a whole, keep in mind that:

  1. The attribution model(s) you use should be easy to understand and communicate for those using them.
  2. You should use multiple attribution methods only as needed.
  3. An attribution model is a way to set benchmarks and performance standards, and it is not how to track the outcome of every marketing dollar spent.
  4. If the model isn’t making the jobs of people who use it easier, it’s time to reassess.

Some of the different approaches to attribution include:

  • First Touch Attribution
    First touch attribution credits all of the success from an achieved outcome to the first campaign a prospect interacted with. However, this method only provides limited visibility into the other activities that may have affected the positive outcome, leaving your team with little information to strategize from when formulating new questions to ask of the data.

  • Last Touch Attribution
    Last touch attribution assigns all the credit of a successful outcome to the final campaign associated with it. While this method is easy to implement, especially when re-engaging with existing accounts in your database, the long-term effects of this method limit your ability to attribute marketing efforts to new leads in the funnel, which can cause bigger challenges down the road.

  • Multi-touch Attribution
    Much like it sounds, multi-touch attribution spreads the credit across all campaigns. You can do this through linear distribution, which gives each campaign an equal cut of the credit from a successful outcome. Another way to do this is with weighted distribution, in which your teams can assign credit to the involved campaigns based on factors like time and campaign position as it relates to goal conversion.
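The two distribution approaches above can be sketched in a few lines of JavaScript. Note this is an illustrative sketch: the touch names and the 40/20/40 "U-shaped" weighting are assumptions made for the example, not values prescribed by any standard model.

```javascript
// Illustrative multi-touch attribution sketch (hypothetical touchpoints).
const touches = ['social_ad', 'webinar', 'email_nurture', 'demo_request'];

// Linear distribution: every campaign gets an equal cut of the credit.
function linearCredit(touches) {
  return touches.map(t => ({ touch: t, credit: 1 / touches.length }));
}

// Weighted distribution, e.g. a "U-shaped" model: the first and last touches
// get 40% each, and the remaining 20% is split among the middle touches.
function uShapedCredit(touches) {
  const middleShare = 0.2 / (touches.length - 2);
  return touches.map((t, i) => ({
    touch: t,
    credit: (i === 0 || i === touches.length - 1) ? 0.4 : middleShare,
  }));
}

console.log(linearCredit(touches));  // each touch gets credit 0.25
console.log(uShapedCredit(touches)); // credits: 0.4, 0.1, 0.1, 0.4
```

Either way, the credits for a single conversion sum to 1, which helps keep the model easy to understand and communicate.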

By understanding the various factors contributing to the outcomes needed to achieve the set goals, your teams can begin to iterate on the tactics, tests, or campaigns themselves that are generating the best results. Proper attribution allows you to continue refining the questions you want answers to, leading to more specific measures of success that can be used to drive more positive campaign outcomes.

Test Strategically

A/B tests are experiments based on differentiated campaign variables, which makes them one of the most fun and most tricky parts of any analytics-driven marketing initiative. While they encourage creative and critical thinking, A/B tests must be developed with the overall learning goal in mind. From colors to wording, page flows, and heatmaps, developing and deploying A/B tests will help your teams generate better insights from variables in a given campaign.

Step 4 - Measure And Share Results

Whether your tests run for days, weeks, or an extended amount of time, assessing the results of these tests and the effect of your sample size on the results will allow your teams to measure the impact of their activities against the strategies tied to the overall goal. Sharing the results of these intentional marketing efforts with the entire organization will help educate and demonstrate the impact data-driven marketing initiatives can have on the business’s bottom line.

Go Further With Data-Driven Marketing

Download the Analytics-Driven Marketing for Action ebook for case-study examples of what marketing analytics in action looks like, or reach out to our teams to learn more about how Looker can help realize data-driven marketing at your organization.

Stepping Out of The Data Silo - The Evolution of Digital Marketing Metrics and Analytics

Anyone tackling digital marketing today knows that they are responsible for finding the right balance of art, science, and magic to maximize the value of their marketing spend. It stands to reason, then, that the more insight that can be gained from digital marketing analytics and the more timely these insights can be made available, the better decisions one can make when allocating where, when, and how much — or little — to invest in digital marketing spend.

How Digital Marketing Metrics Were First Measured

Early on in digital marketing, success was measured by the number of eyes that could be attributed to a given ad. If you could get an ad in front of an increasing number of people — or even better, get them to click on it — the campaign was considered a success. Metrics like Number of Impressions, Cost per Thousand Impressions (or CPM), Clicks, and Cost Per Click were among the key metrics used for determining the effectiveness of campaigns and accuracy of marketing KPIs. However, it soon became apparent that there was not always a direct correlation between increasing the number of clicks and an impact on the bottom line.

To complicate things further, the number of digital marketing channels continued to increase to include Google Ads, Facebook, Yahoo, YouTube, and so on — each with its own completely separate set of metrics. To get the full picture of the data, digital marketing teams were left with the choice to either work with disparate silos of data, or go through the painful process of manually combining data into spreadsheets.

To make better use of digital marketing data, today’s marketers need to be able to look at more than just impressions and clicks. Marketers need to know what happens after a person clicks on an ad and understand what — if any — value that provides to the business overall.

Digital Marketing Attribution Changes for Sales and Marketing

Ideally, any action a person takes following a click on a digital marketing ad will lead that person further into your business’s funnel. Measuring the impact of that action as you get further into the funnel can vary based on your sales cycle. For a standard business-to-consumer website, this isn’t too difficult to do with the correct attribution. But for more complex sales cycles where a click may result in a lead being created in a CRM, followed by a product trial, then a purchase, etc., tracking attribution can quickly go from easy to confusing.

When done correctly, the ability to measure the success of a campaign throughout the entire customer journey with data attribution is priceless. Realizing this, Google and Salesforce teamed up to give marketers a unified view of their Google Marketing Platform products alongside Salesforce. In addition, Google created products such as GA360 to help minimize siloed views of data for better analysis. Because of these advancements, it’s become possible for marketers to focus on metrics that show them how their digital marketing efforts are affecting the bottom line and driving success for the organization throughout the entire customer journey.

Centralizing Digital Marketing Metrics, Attribution, and Analyses

Another challenge associated with digital marketing is the ability to aggregate data in a way that makes analysis seamless yet still reliable. Many times, moving data can result in lost time or information, neither of which help marketers understand how their efforts are or are not working. Using Google BigQuery is one way to solve this challenge.

By moving data from the Google Marketing Platform into BigQuery, marketers have the ability to perform complex analyses that may not have been possible with other solutions. And when you combine this data with data from mission-critical systems beyond Salesforce, the insights that can be gained are limited only by the imagination. This results in being able to better answer questions such as:

  • Which campaigns result in customers with the highest Customer Support requirements?
  • Which campaigns result in customers with the highest renewal rates?
  • How does the effectiveness of our Google Ad spend compare to what we are spending on other advertising platforms?

Digital Marketing Analytics with Looker

Just as data needs to help tell a full story, metrics need to do more than report — they need to provide insight and drive action.

To empower marketers with the necessary data to drive those actions, Looker’s pre-built applications, made for specific marketing use cases such as Digital Marketing, allow for analyses and reporting across digital marketing sources such as Google, Facebook, Bing, Pinterest, and LinkedIn. With one, centralized location to assess active campaigns, conversion rates, and ad spend, marketers can make real-time decisions based on the most effective methods and messages to help improve upon their marketing approach and strategies.

Increase The Impact of your Digital Marketing Efforts

While every organization’s digital marketing strategy is different, having ready access to the digital marketing metrics that impact the business as a whole means real-dollar ROI can be accurately assigned to the campaigns that work, while shedding light on those that don’t.

Download the Analytics-Driven Marketing for Action ebook or reach out to our team to learn more about increasing the impact of your digital marketing efforts.

Data of Thrones: Direwolves, Unique Deaths, and Word Trends... Oh my!

Like Hot Pie excitedly sharing his new cooking tips with Arya, we’re eager to share some more insights from the #dataofthrones that have been surfaced in the weeks since introducing the Game of Thrones data dashboard and the Gender of Thrones analysis.

So whether you’re still processing the most recent events of this season, in denial about Game of Thrones already being halfway over, or are here for all-things data — you’re in the right place. Check out these GoT insights and share yours with us on the Community and social media #dataofthrones.

Ghost, where have you been?

Almost from out of nowhere, season eight blessed us with the on-screen return of Ghost. This happy but sudden occurrence had us wondering — when was the last time we saw a direwolf on Game of Thrones? We consulted the data and got all the feels.

Explore here

Deaths by house throughout the series

Looker’s own Rufus Holmes shared this cool Look comparing deaths by house season-over-season. If you filter on only houses Baratheon and Stark, there seems to be a pattern of one season with major deaths for house Stark, followed by a season with more deaths in house Baratheon.

Curious — I wonder if that will carry on into the end of the series... what do you think?

Explore here

Common and uncommon word frequency

Community Member Haley Baldwin found some interesting trends associated with word frequency in Game of Thrones. As Haley shared:

“The upward trend of NORTH and DRAGON, peaking in season 7, makes a lot of sense and was interesting to see! Dragons became really important to the plot last season and there was also a lot of chanting “King in the North!” in Winterfell.”

Explore here

Said with honor

Not that we didn’t already think Brienne was one of the most — if not the most — honorable GoT characters, but we consulted the data to confirm. The conclusion — Brienne = just as amazing as we thought.

Explore here

The evolution of one-time-only GoT deaths

In addition to word frequency trends, Haley shared these interesting results about GoT deaths throughout the series —

“When you look at the types of death in Game of Thrones, it’s clear that most named characters died by some form of stabbing. I wanted to find out which types of death had just one victim. In this table you can see that the most unique deaths happened in Season 1 and they’ve been declining since then. The violence definitely hasn’t decreased overall, but it seems that named characters have become less likely to die in unique ways.”

Explore here

Who will (or already did) battle karma before the end of Game of Thrones?

Right before the season eight premiere, Looker’s Sloane Ansell whipped up this word cloud of the remaining GoT characters who’ve killed throughout the series yet still had not been cut down themselves. If you made bets or predictions for the final season, I hope you had a chance to consult this data...

Explore here

Explore the Game of Thrones data

Ready to dive in yourself? Start exploring our Game of Thrones dashboard and be sure to share what you find!

Disclaimer: Game of Thrones belongs to HBO and is not affiliated with Looker in any way.

iFrame Sandbox Permissions Tutorial

Understanding iFrame Sandboxes and iFrame Security

Embedding third-party JavaScript in web applications is a tale as old as time. Whether it’s dropping a widget onto your web page or including custom content from a client in your cloud application, it’s something that many developers have encountered in their career. We all know about the iframe element in HTML, but how much do we really know about how it works? What are the security concerns associated with running code inside of an iframe and, furthermore, how can the HTML5 sandbox attribute on the frame alleviate these concerns?

The goal of this tutorial is to walk through the various security risks associated with running third-party JavaScript on your page and explain how sandboxed iframes can alleviate those issues by restricting the permissions it is allowed to run with.

In this post, we’ll demonstrate setting up a demo application from the ground up that will simulate running JavaScript coming from a different origin. What we should end up with is a sandboxed environment in which we can execute any arbitrary JavaScript and still sleep well at night, knowing our host application will be safe from harm.

With all of that in mind, the guided walkthrough will consist of the following parts:

  1. Setting up two node servers to simulate two different origins
  2. Embedding the content of our client page in an iframe on the host page and investigating what the client iframe is and is not allowed to do
  3. Applying the sandbox attribute to the iframe and exploring the various options for the sandbox.

Let’s get started!

Step 1: Setting up the Servers for our Demo Application

To simulate executing code from a different origin, we are going to set up two node servers — one which we’ll call the host and a second which we’ll call the client. We can do this using node’s http library to listen on and serve from two different ports.

// server.js
const http = require('http');
const hostname = 'localhost';
const host_port = 8000;
const client_port = 8001;

const host_server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('This is the Host Server\n');
}).listen(host_port, hostname, () => {
  console.log(`Server running at http://${hostname}:${host_port}/`);
});

const client_server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('This is the Client Server\n');
}).listen(client_port, hostname, () => {
  console.log(`Client running at http://${hostname}:${client_port}/`);
});
Save this JS file to whatever name you’d like - I called it server.js. Then, to start our server, we can simply run:

node server.js

This should start two different HTTP servers, one on port 8000 and the second on port 8001. To test that everything is working, you can individually visit your localhost on ports 8000 and 8001 and confirm that each returns its plain-text message.

Even though they are both running on localhost, the same-origin policy that browsers implement operates on a protocol-host-port tuple: two URLs share an origin only if the protocol, the hostname, and the port number all match. In this case, the protocol and host are the same, but since the ports differ, these are considered different origins.
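That tuple check can be expressed as a small comparison helper. This is just an illustration of the rule, written with the WHATWG URL class (available in modern browsers and Node), not anything the browser exposes directly:

```javascript
// Two URLs share an origin only when protocol, hostname, and port all match.
function sameOrigin(urlA, urlB) {
  const a = new URL(urlA);
  const b = new URL(urlB);
  return a.protocol === b.protocol &&
         a.hostname === b.hostname &&
         a.port === b.port;
}

console.log(sameOrigin('http://localhost:8000/', 'http://localhost:8000/page')); // true
console.log(sameOrigin('http://localhost:8000/', 'http://localhost:8001/'));     // false: different port
console.log(sameOrigin('http://localhost:8000/', 'https://localhost:8000/'));    // false: different protocol
```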

Of course, just having a hard-coded response won’t get us very far. We’ll need to be able to serve both HTML and JS for this demo. To do this, I whipped up a function to serve assets from a given folder.

function serveAsset(rootPath, url, res) {
  // default root route to index.html in the folder
  if (url === '/') url = 'index.html';

  const filePath = path.join(__dirname, rootPath, url);
  const readStream = fileSystem.createReadStream(filePath)
    .on('error', function() {
      res.statusCode = 404;
      res.end();
    });

  if (/^.*\.js$/.test(url)) {
    res.setHeader('Content-Type', 'text/javascript');
  } else {
    res.setHeader('Content-Type', 'text/html');
  }

  readStream.pipe(res);
}
This function will take in a root folder path, a url, and the response object. If the file is not found, it will return 404. If it is found, it will set the header to be text/javascript or text/html, depending on the file suffix. To get this to work, we need to include two more dependencies at the top of the file:

const fileSystem = require('fs');
const path = require('path');

Fun fact - Since we’re just using built-in node libraries, we do not have to install anything via npm! Once you instantiate fileSystem, path, and our asset function, go ahead and update your servers as well to call serveAsset.

const host_server = http.createServer((req, res) => {
  serveAsset('host', req.url, res)
}).listen(host_port, hostname, () => {
  console.log(`Server running at http://${hostname}:${host_port}/`);
});

const client_server = http.createServer((req, res) => {
  serveAsset('client', req.url, res)
}).listen(client_port, hostname, () => {
  console.log(`Client running at http://${hostname}:${client_port}/`);
});

These now look very similar. The only difference is that the host_server will look for its assets in the host folder and the client_server will look in the client folder. If we were to restart our server now, we would see the following error message:


This is because our serveAsset function is looking in either the host or client folder for an asset to serve, and we haven’t created them yet! Let’s create both of them, each with an index.html and a JS file.

» mkdir host; mkdir client; touch host/index.html; touch host/host.js; touch client/index.html; touch client/client.js

Our file structure should look like this:

server.js
host/
  index.html
  host.js
client/
  index.html
  client.js

Now, if we start our server and visit our localhost location, we no longer get the 404, which means our server found the file! — but it has no content yet. To get some content in, let’s start with something very simple. For the host, we simply have the HTML as:

<!-- index.html -->
   <h1>Host Page</h1>
   <p>Host message container</p>
   <script type="text/javascript" src="host.js"></script>

And the JavaScript as:

alert('hello from the host')

The client content is exactly the same, just with the word host changed to client. If we restart our server now, we should be able to go to both http://localhost:8000/ and http://localhost:8001/ and see our content in action! Each page should send an alert from the JS file and then render our html content to the page.

Each node server is serving an index.html file which includes a JavaScript file

Step 2: Embedding the Client in the Host without Sandboxing and Investigating its Permissions

With our two servers running, we are now ready to begin testing some iframe scenarios. Before we do that, let’s add a second file to our host so that we can compare the permissions of an iframe from the same origin and an iframe from a different origin.

touch host/hosted-client.html; touch host/hosted-client.js

Fill in the content of these files with the same content as we used for our other html/JS pairs. We’ll call this one the “Hosted Client,” meaning an iframe client coming from the same origin as our host.

Once we do that, back in our host/index.html we can iframe both our same-origin client and our different-origin client.

<iframe width=400 height=300 src="http://localhost:8000/hosted-client.html"></iframe>
<iframe width=400 height=300 src="http://127.0.0.1:8001/"></iframe>

Please note that we use localhost for the hosted-client and 127.0.0.1 for the other. This will become important in the cookies section below. Refreshing the host page, you should now see two iframes, each with the content of our individual HTML files.

If you have alerts in your JS files like I do, you should have seen that each individual file triggers an alert at the top level of the browser. Should it be able to do this? That question brings us to our first area of concern — pop-ups and dialog boxes.

Pop-ups and Dialog Boxes

JavaScript has three different functions that trigger a popup — alert, prompt, and confirm. Each of these open up a dialog box at the top of the browsing context, regardless of whether it comes from the top-level window or not. The scary thing about these dialogs, as can be found in the alert documentation, is:

“Dialog boxes are modal windows — they prevent the user from accessing the rest of the program’s interface until the dialog box is closed. For this reason, you should not overuse any function that creates a dialog box (or modal window).”

I am sure that most of us have, at some point in our lives, been forcefully redirected to a spammy site against our will, only to be bombarded with these types of dialogs. Even when you try to close out of one, it just pops open another. This annoying behavior completely blocks you out of using a site and, sadly, it’s incredibly simple to reproduce. Try adding this to your client.js file:

(function unescapablePrompt() {
  if (window.confirm("Do you want to win $1000?!?!")) {
    /* Open some spammy webpage or redirect */
  } else {
    // Re-open the dialog until the user gives in
    unescapablePrompt();
  }
})();
Now, when you visit your host site, you get this:


These prompts make it impossible to ever interact with the page and you have to close the entire tab or kill the JavaScript execution to get rid of it. The worst part is that regardless of whether your embedded content is coming from a different host or the same origin, this behavior is exploitable.

You might think that you could just do something like this to get rid of it:

alert = prompt = confirm = function () { } // does not work!

The problem with this is that each iframe operates within its own nested browsing context, so overriding the functions at the host level will not affect the functions at the frame level.

Fortunately, sandboxing can come to our rescue here, which we will see later in this post. For now, let’s move on to whether or not the iframe can navigate the page away from the current one.

Step 3: Top-level Window Navigation and Opening New Tabs

Let's now examine how and if the iframes are able to change the url of the top-level window and if they’re able to open a new window.

There are two different methods that we want to test here.

  1. window.open for opening new windows and tabs, and
  2. window.location for navigating the page away from the current url.

Helpful Note: an iframe can reference its top-level window using window.top. Similarly, it can reference its parent’s window with window.parent. In our case, they do the same thing.

Let’s remove all the code in client.js and replace it with:

window.top.location = 'http://localhost:8001'

This will attempt to redirect the top-level window to the client host. If we run this, we get the following error:


That’s great! It blocks the automatic redirect. But wait — what’s that last part? “… nor has it received a user gesture.”

What does that mean? Does it mean that a user-initiated gesture can navigate the window? Let’s try it.

// client.js
function clickNav () {
  window.top.location = 'http://localhost:8001'
}

<!-- client/index.html -->
<a href="" onclick="clickNav()">Navigate me</a>

If we add this code to our JS and HTML respectively, it will add a link to the client page. When we click it, the page navigates!


As it turns out, a user-initiated action can navigate the top-level window, which is the basic idea behind clickjacking. A typical clickjacking attack will put transparent click boxes over a page and then “hijack” the click to redirect the page to a different url. window.open works the same way.

// client.js
window.open('http://localhost:8001')

function clickNav () {
  window.open('http://localhost:8001')
}

If we have this, the browser will block the pop-up request of the outer function call. However, when we have it in a click handler, it will open the window regardless of whether we blocked pop-ups or not.

The other part of the earlier error message — that the iframe “does not have the same origin as the window it is targeting” — implies the inverse: if our iframe is the same origin as the host, it has full access to redirect the page, click-action or not.

// hosted-client.js
window.top.location = 'http://localhost:8001'

If you run this, the browser will allow the redirect to happen since it is the same origin. This is not the case for window.open. Even if it is from the same origin, the browser will block the window.open unless you explicitly tell the browser to allow the pop-up.
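As an illustration of the clickjacking pattern described earlier (a hypothetical snippet, not part of our demo files), an attacker can stack a fully transparent link over legitimate UI so a click on the visible element actually lands on the attacker’s element:

```html
<!-- Hypothetical clickjacking sketch: the transparent link sits on top of
     the button, so a user who clicks the button actually clicks the link. -->
<div style="position: relative;">
  <button>Play video</button>
  <a href="http://localhost:8001" style="position: absolute; top: 0; left: 0;
     width: 100%; height: 100%; opacity: 0;">&nbsp;</a>
</div>
```

Because the click is user-initiated, the browser treats the resulting navigation as a legitimate user gesture.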

Cookies and Browser Requests

The final thing we are going to look at is browser cookies. Before getting started, make sure your hosted-client iframe is pointed at localhost and your client iframe is pointed at 127.0.0.1:

<iframe width=400 height=300 src="http://localhost:8000/hosted-client.html"></iframe>
<iframe width=400 height=300 src="http://127.0.0.1:8001/"></iframe>

We need to do this because cookies care about domain and ignore the port. Once you have checked this, let’s set a cookie on the host.

// host.js
document.cookie = "session_id=A38XJISDASDW120"

This represents a session ID, something that is often included in requests. Let’s see if our iframes can access the cookies.

// in client.js
console.log(document.cookie) // ""

// in hosted-client.js
console.log(document.cookie) // session_id=A38XJISDASDW120

As you can see, the client cannot access the cookies since it is a different origin, but the hosted-client can. Let’s try to make a request using the fetch API.

// hosted-client.js
var myRequest = new Request('http://localhost:8000');
fetch(myRequest, {
  method: 'GET',
  credentials: "include"
}).then(function(response) {
  console.log(response);
});
When we do this, we get a 200 response.

Response {type: "basic", url: "http://localhost:8000/", redirected: false, status: 200, ok: true, ...}

That means the server accepted the request and gave us a response, but did it send through the cookies? We can check this on the server side. Let’s add a console log to our host server request handler.

const host_server = http.createServer((req, res) => {
  console.log(req.headers);
  serveAsset('host', req.url, res)
}).listen(host_port, hostname, () => {
  console.log(`Server running at http://${hostname}:${host_port}/`);
});
This will output the headers when we make a request to our host server. Make sure you restart the server after you add this line, and then reload your page and look for the request coming from the hosted-client. It looks like this:

{
  host: 'localhost:8000',
  connection: 'keep-alive',
  'user-agent': 'Mozilla/5.0 (Macintosh......',
  accept: '*/*',
  referer: 'http://localhost:8000/hosted-client.html',
  'accept-encoding': 'gzip, deflate, br',
  'accept-language': 'en-US,en;q=0.9',
  cookie: 'session_id=A38XJISDASDW120'
}

As you can see, it sends the cookies through. If the server were to use just the session ID to authenticate the request, then it would think this is a legitimate request.
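The scoping rule we just relied on, cookies keyed by host name with the port ignored, can be sketched as a toy predicate. This is a deliberate simplification for illustration; real cookie matching also considers the Domain, Path, and Secure attributes:

```javascript
// Toy sketch: a cookie set for a host is visible to any page on that host,
// whatever the port; a different host name (even another loopback alias)
// cannot read it.
function cookieVisibleTo(cookieHost, pageUrl) {
  return new URL(pageUrl).hostname === cookieHost;
}

console.log(cookieVisibleTo('localhost', 'http://localhost:8000/')); // true
console.log(cookieVisibleTo('localhost', 'http://localhost:8001/')); // true: port is ignored
console.log(cookieVisibleTo('localhost', 'http://127.0.0.1:8001/')); // false: different host
```

This is why our client iframe lives on 127.0.0.1 while the hosted-client lives on localhost: the port difference alone would not have kept the cookie away from it.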

Step 4: Applying the Sandbox Attribute to the iframe

So far, we’ve identified four areas of concern when working with iframes.

  1. Iframes can exploit pop-up dialog boxes to prevent interaction with the website
  2. Different-origin iframes can navigate the top-level window through clickjacking
  3. Same-origin iframes can navigate the top-level window even without user interaction
  4. Same-origin iframes can make requests with cookies

Now we’re going to begin making use of the sandbox attribute for iframes, introduced in HTML5. When added to an iframe, the sandbox attribute restricts pretty much all scripts and browser behavior of any kind. It is not until we add permissions in a space-separated list that we enable the exact capabilities we want. To see its initial state, add the attribute as an empty string to both of our iframes.

<iframe sandbox="" width=400 height=300 src="http://localhost:8000/hosted-client.html"></iframe>
<iframe sandbox="" width=400 height=300 src=""></iframe>

When we sandbox the iframe, it blocks all scripts from executing

Sandboxed iframes with no permissions block all scripts from running

Getting this to work starts by allowing various permissions one at a time. The full list of string values can be found in the iframe documentation under the sandbox section. We will be starting with allow-scripts.

Allowing Scripts

To begin here, let’s clear out our client.js and hosted-client.js and start with a simple console log.

console.log("I executed!")

Without defining any permissions, our sandbox won’t allow a console log to run. We can get our script running by adding the allow-scripts permission to our iframe’s sandbox attribute.

sandbox="allow-scripts"
Once you do this and refresh the page, you should see a console log from each of our pages.

Pop-ups and Modals

One of the concerns we learned about is that an iframe can pop up dialog boxes at the top of the browsing context and prevent the user from interacting with the page. To see if this is exploitable in a sandbox, let’s add an alert to our client script.

alert("hello from the client")

When we run that we get the following error:

"Ignored call to 'alert()'. The document is sandboxed, and the 'allow-modals' keyword is not set."

Even though we are allowing scripts to run, the sandbox still limits a lot of the browser behaviors. In order for the alert to work from the iframe, we would have to add the allow-modals property to the iframe.

sandbox="allow-scripts allow-modals"

Keep in mind that this is an all-or-nothing setting: we cannot allow some dialogs and block others. That’s an acceptable restriction, in my opinion, and it crosses off our first iframe security concern.

Top-level Window Navigation and Opening New Tabs

Our second and third security concerns are related to navigating the page away from the original URL. We saw that a same-origin iframe could navigate the page without a user interaction, and that a different-origin iframe could do so with user interaction. Let’s try this in our sandbox.

// client.js
function clickNav () {
  window.top.location = 'http://localhost:8001' // runs on user click
}
window.top.location = 'http://localhost:8001' // attempted immediately on load

This results in the following errors:


With this, we covered two cases at once: the immediate attempt to change the location failed, as did the one on click. We can allow any navigation with allow-top-navigation, or only user-activated navigation with allow-top-navigation-by-user-activation, by applying the corresponding permission to our iframe.

sandbox="allow-scripts allow-top-navigation-by-user-activation"

When we turn this on, the different-origin iframe can redirect the page upon user action. The same holds for same-origin iframes: the navigation permissions are set explicitly, regardless of origin.
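The behavior of these two permissions can be summed up in a toy decision function. This is just a model of the rules we observed, not actual browser code:

```javascript
// Toy model of the two navigation permissions (not real browser code):
// given an iframe's sandbox tokens, decide whether an attempt to set
// window.top.location should be allowed.
function navigationAllowed(sandboxTokens, { userActivated }) {
  // allow-top-navigation permits navigation unconditionally
  if (sandboxTokens.includes('allow-top-navigation')) return true;
  // the -by-user-activation variant permits it only during a user gesture
  if (sandboxTokens.includes('allow-top-navigation-by-user-activation')) {
    return userActivated;
  }
  // a sandboxed frame with neither token cannot navigate the top window
  return false;
}
```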

Cookies and Browser Requests

The final concern to address is the ability to access cookies and make requests with same-origin iframes. Let’s try accessing the cookies with a sandboxed iframe.

// hosted-client.js
console.log(document.cookie)
Unlike last time, this results in the following error:

Uncaught DOMException: Failed to read the 'cookie' property from 'Document': The document is sandboxed and lacks the 'allow-same-origin' flag.

Similarly, when we try to make a request with the same request code as the previous section, we get a different error.

Failed to load http://localhost:8000/: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'null' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

The above error is the browser blocking our request because we no longer have the same origin. This is because the sandbox property sets the origin of the frame to null, meaning it will now be a cross-origin request, even though the iframe is hosted on the same domain.
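The origin comparison behind this can be modeled in a couple of lines. The point is that a sandboxed frame’s opaque (“null”) origin never equals any real origin, so the check always fails:

```javascript
// Toy model of the browser's origin comparison (illustrative only).
// A sandboxed iframe without allow-same-origin has an opaque origin,
// represented here as null, which can never match a real origin.
function isCrossOrigin(frameOrigin, targetOrigin) {
  return frameOrigin === null || frameOrigin !== targetOrigin;
}
```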

Adding the allow-same-origin sandbox attribute will prevent both of these errors from occurring. However, you should be careful and make sure you have complete control over the content of the frame before using it. As noted in the Mozilla iframe documentation:

“When the embedded document has the same origin as the main page, it is strongly discouraged to use both allow-scripts and allow-same-origin at the same time, as that allows the embedded document to programmatically remove the sandbox attribute. Although it is accepted, this case is no more secure than not using the sandbox attribute.”

Generally speaking, if you find yourself needing both allow-scripts and allow-same-origin for your sandbox, you should ask yourself why you are iframing in the first place, and whether or not having the sandbox property is appropriate.
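One way to keep yourself honest here is a lint-style check over the sandbox values you ship. The snippet below is a sketch of such a check, not something from the tutorial itself:

```javascript
// Sketch of a lint-style check for the footgun described above: flag any
// sandbox value that combines allow-scripts with allow-same-origin, since
// a same-origin embedded script could then remove its own sandbox.
function sandboxEscapePossible(sandboxValue) {
  const tokens = sandboxValue.trim().split(/\s+/);
  return tokens.includes('allow-scripts') && tokens.includes('allow-same-origin');
}
```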

Putting It All Together: How We Use iframes at Looker

As a practical example of all this: here at Looker, we use iframes to let customers create and run their own custom visualizations within our application.

Since we have no way to vet every single line of code our customers could write for their custom visualizations, we needed to create a secure execution environment to run that code in. A diagram of this environment can be seen below.

Looker explore page using an iframe to render custom data visualization


We leverage the postMessage API to pass the data in and to receive back any events or errors that the visualization produces. Given the restrictions of the sandboxed iframe, it is not able to make calls outside of its own frame, nor is it able to read or modify anything about the parent page. This lets us rest assured that both our application and our customers’ data are safe and secure.

Wrapping Up

I hope you found this post helpful as you address security concerns related to sandboxed iframes. By walking through this tutorial, you should now have a better understanding of:

  • How sandboxed iframes without the allow-modals permission are kept from blocking user interaction on the page with dialog boxes
  • How withholding the allow-top-navigation and allow-top-navigation-by-user-activation permissions stops same-origin iframes from redirecting the top-level page without user interaction, and different-origin iframes from doing so with user interaction
  • Why sandboxed iframes without the allow-same-origin permission prevent same-origin iframes from accessing the domain’s cookies and making requests as if they were the host

If you have any questions about this tutorial, feel free to reach out on the Looker Community. And if you’re curious to learn more about our Looker team, check out our open positions.

<![CDATA[The Podium - May 1st, 2019]]> http://looker.com/blog/the-podium-may-1 http://looker.com/blog/the-podium-may-1 Hello there, good Lookers!

Wishing you all a happy May Day today (wow, it’s already May!). A couple of fun reminders to share —

Whether you’re still spinning out from the most recent episode or you just love all things data, don’t forget to check out our Gender Analysis of Game of Thrones data. Find something that surprised you or your own interesting tidbit? Hop into the Community and share it with us and your fellow GoT/data fans.

Also, super excited to share that if you’re a developer, coder, or all-around data lover in Europe, we’re having our first-ever London Hackathon on May 17th! Full details for the event are coming soon, so be sure to register so you can get all the details.

Izzy Miller
Community Manager, Looker

Questions and Answers

Hiding Fields Based On Permissions

Dawid_Nawrot started a major discussion about the best way to conditionally hide some fields from the UI, while still allowing queries containing them to be run by all users. Check out the full thread here.
Looker Community Discussion — Hiding fields based on permissions

YTD / Previous Year Fields

Bens came through with some great examples of how to do Year To Date calculations using a yesno field - awesome stuff, Bens!
Looker Community Discussion — YTD and Previous year dimensions

Community Spotlight — Jonathan Palmer

Our guest this week is @Jonathan_Palmer. Welcome, Jonathan!

What do you do for work?

I’m the Head of BI at GoCardless, a London-based recurring payments platform.

What’s your secret skill or talent?

I can play Sweet Child O’ Mine on the ukulele.

What’s the coolest thing you’ve ever done with Looker?

Driven successful adoption. I was lucky enough to lead taking King from 0 to 800+ users in 9 months, and now at GoCardless we’ve gone from 0 to 270 (85% of the company) highly engaged users in 4 months.

Most useful Looker feature?

Surely the answer has to be LookML? Abstracting SQL is a win for the BI pro and the end user: less code and more flexibility without compromising accuracy.

Pie chart or bar chart?

Histogram :wink:

What’s your favorite part of the Looker Community?

It’s great to see a proper Feature Requests section now. Would love to see more people using it and making the product roadmap more democratic.

What’s your biggest feature request?

An extendable and full history i__looker model, out of the box. Understanding Looker usage data is hugely important to driving adoption, so making this as easy as possible would be awesome.

Favorite London Tube stop?

St John’s Wood — because it is the only tube stop with none of the letters of the word ‘mackerel’ in it, obviously.

Community Knowledge Share

Looker Life Hacks

This is the mother of all knowledge sharing! @nicholaswongsg kicked off a mega-thread of members sharing their nifty Looker tricks and tips. Read the whole thing for an instant Looker level-up.
Looker Community Discussion — Looker Life Hacks

Tricky templated filters

Sam_Asher resurrected an old question from 2016 and dropped in some additional knowledge about suggesting one value in a filter, but inserting an associated value in the SQL.
Looker Community Discussion — Tricky templated filters

Join the Conversation

There you have it! Head over to the Community to add your own life hacks and throw Jonathan some :heart:. Share with us your comments about any specific discussions and tips that have helped you out lately. And of course, send in your questions for the next Spotlight feature.

<![CDATA[Increasing Business Intelligence User Adoption with Looker at Adore Me]]> http://looker.com/blog/how-to-increase-business-intelligence-user-adoption http://looker.com/blog/how-to-increase-business-intelligence-user-adoption Achieving user adoption is a challenge that is as unique as the people that make up an organization. There is no exact recipe for it, and no bulleted list you can follow to get there. The reality is, whether you’re switching over to a shiny new tool or revamping an entire process, human beings have a tendency to be resistant to change.

This can make addressing user adoption quite tricky.

At Adore Me, this was one of my most interesting and time-consuming challenges. While I learned a lot of things the hard way, the road to user adoption was also filled with lots of learnings and truly awesome moments. Below are some of the strategies and tactics that worked best for us at Adore Me.

Business Intelligence at Adore Me

Before Looker, our business intelligence solution was an old platform of manual reports and rogue files. It was a typical breadline model, where I was the provider of SQL-generated reports day in and day out. It was taxing, frustrating, repetitive, and cost me many evenings working after hours to provide data reports for people that had no other way of getting them.

Cue Looker.

I was first introduced to Looker an hour before a meeting where I was supposed to teach someone else how to use the platform. As a developer, that “not knowing” part of being handed the keys to something so brand new was terrifying. To my pleasant surprise, Looker proved to be much easier than I expected and opened up a world of possibilities for me. I was able to set up all the models and tables without issue, until I hit a problem I hadn’t anticipated — everyone was reluctant to use it.

As time went on, I gradually shifted away from the “developer” mindset of using a BI tool and opened my eyes to the end-business user perspective. By doing this, I learned a few key lessons as we worked on our user adoption strategy, with some of the most important takeaways being to:

  • Create fewer, simpler choices for end users
  • Avoid data dumps
  • Understand the learning curve
  • Learn to build trust between the data and the Looker platform

Design with your end users in mind

Everyone wants to use a reporting tool because it’s easy. If you’re a developer, though, you don’t strictly need one. Without a reporting tool, business users have no choice other than to constantly ask for the data. Considering the differences in everyone's data-related needs is important when aiming for increased user adoption across the organization.

Put data in context

Data, in general, needs to be connected in order to be valuable and actionable. Think about it — you can’t get insights from simply looking at a list of emails. Instead, people want to be able to answer specific questions from well-connected data, such as “Did the customer like their purchase?” and “Did anyone have a bad experience?” If people can answer questions like these right away and from one central location, there’s a greater chance they’ll stay focused and engaged with the data, rather than losing interest and never coming back to the platform.

Don’t create a data dump

The automatic view creator is a grand thing to have, but when it comes to the user experience, don’t put too much faith in it. When creating a model, less really is more!

Streamline your workflows

Keep in mind that what you’re doing is in fact frontend work and it needs to have a flow and make sense, especially for a non-experienced user. The fewer decisions people have to pick from when choosing an area to start exploring, the better.

Keep data simple

Within explores, if people are seeing all of the data right away during their first exploring experience, it can get overwhelming really fast. While providing numerous choices and anticipating the potential needs of explorers is great, showing too much too early may discourage them from continued and consistent data exploration later on. It’s better to start small, expose less data, and let people request it as they get comfortable exploring on their own.

Avoid redundancy in your datasets

If you know the data, it’s easy to tell which fields you’ll need in order to get the information you’re after. But someone who doesn’t know the data will most likely choose the first option that pops up in the search for a respective field.

Will it be the best one for them to use based on the information they’re looking for? Maybe, but probably not.

When modeling your data into views and explores, it’s best to model it around answering business questions rather than SQL results. Although it might be the work of a brilliant SQL generator behind the scenes, if you want your instance to be successful and useable by everyone, you need to design it based on your business model, and not your physical one.

Ease the learning curve

Whether you're transitioning out of a similar tool or introducing a completely new platform, you're asking people to learn something new. And since people learn at different rates and in different ways, some may take longer than others to get settled in and used to the change.

Share the knowledge

When considering the learning curve, keep in mind that it’s not just about learning a new app, but also about learning how to use a completely different method of thinking. For us at Adore Me, Looker meant learning for the first time how data is connected and how it works. We had to learn how to design data experiences instead of just adding fields and filters, which, in Looker, is the easiest part.

Training, Training, Training!

Oftentimes tutorials, videos, and general user-trainings are the most common ways to help alleviate the pains of a new learning curve. However, you should be prepared if these methods don’t completely solve this for everyone at your organization.

I encourage you to find the sweet spot of training that lies between the knowledge that appeals to users and demonstrating this knowledge with a concept they’ll understand. One way to do this — specifically for an eCommerce use case — is to show how to calculate revenue. By using common concepts people are already familiar with, trainings are more likely to stick with users and be remembered as they continue to learn.

And if you’re like us at Adore Me, as you ramp up your training programs, make sure to reach out to the people at Looker. They will be more than happy to provide you with materials you’ll need to train users yourself.

Build trust with the Looker platform

Whenever you transition to a new platform, there will always be the issue of how much trust people will put in this “new data.” While it’s not actually new, until users get to test the water for themselves, they won’t be inclined to believe you telling them that the water isn’t deep, because you already know how to swim, and they don’t.

Remember, you’re working with people, not computers. Sit down with them, let them tell you their concerns, and if they point out inconsistencies, remember that they’re not pointing at you, they just want to make it better.

Transparency is key

Another way to get people to trust the platform and the data is to let them know what goes on behind the curtain. Strive to be as honest as you can about the issues you find and address along the way. If you just fixed a join, let others know! It may cascade into hundreds of reports, and if you share that knowledge, you’re giving people a chance to understand the subtleties of the behind-the-scenes fixes that affect their day-to-day work.

Include people in the process

In addition to transparency, bringing in users early on in the implementation process to ask for their help and feedback is a great way to build trust. Not only will they feel that their ideas and needs are being heard, but if they’re helping you throughout the process, they’re more likely to trust the final product that you give them.

Remove boundaries and increase BI adoption in your organization

At the end of the day, it’s important to realize that increasing adoption doesn’t mean developers vs. business users, because you’re both trying to do the very best job you can.

Regardless of whether you’re building off your company's existing culture or setting the stage for a new data-driven era, increasing adoption of a platform everyone can collaborate on will result in a lot less friction as you move forward together.

<![CDATA[Build vs. Buy: Choosing The Right Embedded Analytics Solution]]> http://looker.com/blog/embedded-analytics-solutions http://looker.com/blog/embedded-analytics-solutions Your customers expect data. They already get it in their day-to-day lives: they know when that package will arrive, how much screen time they spend on which apps, and how many Twitter followers they have. When they use your product, they expect the same level of access and analysis into your data.

And it's bigger than adding a dashboard in your product — it's about embedding an experience with data that surpasses your competitors, delights your customers, scales with your product growth, and maybe even creates a new revenue stream. All while minimizing the cost of maintenance going forward.

The promise of embedded analytics tools for growth

Let’s say you are a technology leader. You’ve been tasked with growth. It’s your top-priority initiative. You also have valuable data that your customers or partners would really want. So, you set out to explore embedded analytics solutions for delivering data to drive growth.

The strategic opportunity is clear — you could be the first to deliver a new type of data in your market, or you could deliver data much better than your competitors and win new business from them. From that success, you could quickly expand into new and adjacent markets, much like how Kollective did, and rapidly start to unlock real growth for your organization.

As good as that sounds, you start thinking about the costs associated with these efforts. Your team says your customers only need “a simple dashboard.” You probably can’t justify hiring new engineers for such a simple project, so you’d have to pull people away from your core competency, and they’re already over capacity.

How about buying an embedded analytics solution from a vendor to cheaply outsource the problem? Your engineering lead shoots that idea down immediately, sharing horror stories of getting burned badly by analytics vendors in the past...

Uh oh...

The stories are painful. By moving your data into a vendor’s proprietary, vertically integrated solution, you lose transparency into how data transformation and delivery is done. You also lose agility in — and control over — how you can build user experiences with that data.

Then, as you grow your customer base and add new use cases, you consume a ton of resources with the manual maintenance work required to update permissions and metrics definitions for each customer.

Worst of all, the new points of failure introduced by the opaque, heavyweight solution are brittle — the data in the analytics product starts to break. The inaccurate data erodes customer trust in your product, hurts your company’s brand, and could cause real harm to people making decisions from wrong information.

But it’s 2019, and users demand helpful, accurate data. What should you do?

The reality of build vs. buy

Many companies experience these build vs. buy dilemmas. At first, some decide to “DIY” by building their own embedded analytics solution. As the projects suck up unexpected resources, companies discover (the hard way) that the backend powering their “simple dashboard” was hard, slow and expensive to build from scratch and maintain over time. Authentication, permissioning and administration are also particularly challenging, and there’s a whole lot more under the surface —

Looker for Embedded Analytics

The hidden challenges associated with building a data product in-house are why we’ve seen companies choose Powered by Looker (PBL) for their embedded analytics needs. Common benefits we see these organizations experience include growth in their:

  • customer base
  • product lines and revenue streams
  • market competitiveness

The best part: they accomplish this with a lean, agile development team to achieve and sustain a high return on engineering effort.

As an engineer-inspired analytics platform, Looker’s architecture is fundamentally unique. We designed Looker to leverage the power, speed, and unique functionalities of modern databases. We maintain a version-controlled data model to store and manage all data definitions. And we built Looker as a platform, with the intention of powering custom data experiences. This is why PBL is uniquely suited for companies building high stakes data products.

Here’s how each benefit translates into the Looker difference and its bottom-line impact:

  • A future-proof architecture: no vendor lock-in, because Looker is ecosystem agnostic from database to application. Bottom line: save costs by leveraging your existing tech stack and data investments.
  • Rapid time to value: rapidly prototype to drive new and better product-market fit in weeks, not months. Bottom line: accelerate revenue and product engagement.
  • Customization and control: deliver data exactly how you want it, with custom UIs and visualizations. Try it out here. Bottom line: reduce churn with engaging user experiences.
  • Scalable, minimized maintenance: manage metric definitions and permissions in one place and have changes propagate everywhere automatically. Bottom line: decrease costs by minimizing maintenance overhead.
  • Complete data, simply: maintain the richness of your data while abstracting away the complexity of serving it up accurately. Bottom line: reduce churn and account management costs with engaging, self-serve data exploration.
  • Efficient development: use the modern git development workflows and trusted release processes you love. Bottom line: save costs with lean, agile development across teams.
  • Flexible content management: let account managers and product managers build content to give time back to engineers and analysts. Bottom line: focus resources on your core competency and reduce account management costs.
  • Peace of mind from always accurate data: keep your data in-database and query it in real-time so your data is always accurate and fresh. Bottom line: grow trust in your brand and reduce churn risk.
  • New streams of value: offer richer functionality at premium prices to incentivize engagement and upgrades. Bottom line: grow net new and upgrade revenues.
  • Excellent customer support: leverage our expert professional services and support teams along the way. Bottom line: reduce risk with a team of experts.

Learn more

The rise of new database and analytics technologies has finally made it possible to build data products and experiences in the way you might expect from modern software development platforms.

Learn more about what is possible with modern embedded analytics in our Embedded Analytics whitepaper, in this interactive demo built entirely on the Looker platform, or reach out to our team to learn more about embedded analytics with Looker.

<![CDATA[5 Benefits of Having a Dog-Friendly Workplace]]> http://looker.com/blog/5-benefits-dog-friendly-workplace http://looker.com/blog/5-benefits-dog-friendly-workplace I love dogs. I think about my dogs constantly. I greet dogs when I walk by them more comfortably than I greet their owners. When talking about any dog, I am likely to give them the honorific title “Sweet baby [name].” Basically, I am obsessed with dogs.

Having dogs allowed in the office is truly an awesome privilege, but not everyone shares my overwhelming enthusiasm for our canine friends — dogs can be messy and a little distracting. In some of our Looker offices, we have a bring your dog to work policy that ensures a comfortable environment for everyone. Pet owners are instructed to be responsible for their dog (and their dog’s messes), keep to designated dog-friendly parts of the office, and to be considerate and respectful towards their co-workers. With these rules in place, working in a dog-friendly workplace has some great benefits for employees...

1. Improves Health and Morale

Pets have been shown to have a positive effect on their owner's well-being. On a physical level, dogs are good for your heart (aww). Dog owners tend to have lower blood pressure, cholesterol, and triglyceride levels, partly because of the increased physical activity that comes with having a dog. On a psychological level, pets provide companionship and can help with depression and anxiety by improving socialization and self-worth. And studies have shown that even if you don’t own a dog, simply petting one can lower a person's blood pressure.

2. Fosters Collaboration

In a study conducted by the Central Michigan University, having dogs present in the workplace was found to help improve communication and cooperation. For the study, participants were broken into small groups and given random tasks, some with a dog present in the room and others without. Regardless of performance or the given task, groups with a dog present were found to be more cooperative and trusting of their group members and appeared to be more comfortable with each other. Steve Colarelli, lead author of the study, calls dogs a "social lubricant" — they provide an easy way to break the ice and give us something to bond over.

3. Reduces Stress

April is ‘Stress Awareness Month’, but that doesn’t mean dealing with stress is only important one month out of the year. A study by Virginia Commonwealth University revealed that, while there was little difference in baseline stress levels, over the course of a day those that brought their dogs to work had reduced stress, as opposed to increased stress felt by those without dogs. Though the reason behind this is unclear, some theories suggest that dogs can help the body produce oxytocin, a hormone that relieves stress, while also lowering the body’s levels of cortisol, a hormone that is produced by stress.

In addition, the Mayo Clinic says that laughter is a great way to relieve stress and reduce tension. If you’re a dog owner or just enjoy the company of a furry friend, you know that — simply put — dogs are silly and can be the source of many laughs throughout the day.

4. Drives Productivity and Creativity

With improved communication and less stress, it’s no surprise that dogs can boost productivity and creativity. The Journal of Research in Personality conducted a study in which participants came up with goals and measured how they felt about attaining them. Some participants had their pet near, some thought about their pet during the experiment, and others were part of a control group. The result? The first two groups did better in not only listing their goals, but in having the confidence required when it came to achieving them.

Productivity is greatly impacted by the amount of time you take to step away from your desk. Spending too much time focused on a project can inhibit your ability to get work done, but taking small breaks has been shown to stimulate the brain and increase productivity. By bringing a dog to work, you’re guaranteed — by nature — to go on walks and step out of the office during the day, which Stanford University says can improve creative inspiration.

5. Attracts Talent

79% of Human Resource decision makers surveyed by Banfield Pet Hospital said pet-friendly workplace policies were often proactively discussed as a recruitment tool. Additionally, 65% said they were often asked about pet policies during the interview process. The flexibility of a pet-friendly office gives recruiters a competitive edge in attracting top talent, especially among millennials. And with Millennials overtaking baby boomers in pet ownership and "pet parenting" on the rise, it's no surprise that the comfort of having one's dog close by during work hours is a huge selling point when it comes to job hunting.

Four-legged Lookers

All that considered, it is with great enthusiasm that I’d like to introduce you to some of our office pups here at Looker!

Monty is a Field Spaniel which is (sadly) a rare breed. Monty's favourite time of year is Autumn because there is an abundance of leaves and sticks to chase and play with. He loves making new friends — whether they like it or not — and has been told on multiple occasions that he is the happiest and most handsome dog in London! Three of Monty's siblings are competing at Crufts this year, however, Monty was unable to attend due to work commitments at Looker. Monty loves cuddling up to a David Attenborough documentary (penguins are a favourite) and long walks along the beach.

Cash is a lab/husky/pit mix — aka, a totally wonderful mutt. His happy place is when he’s either running at full speed directly into the ocean or being completely passed out on the couch. He knows only two speeds really, 0 or 1000mph. His signature move when in play mode is a 360° spinning kick, which has been known to send many an unsuspecting victim flying. He loves belly rubs, beef jerky and chasing cats whilst being simultaneously outsmarted every single time.

Chewie is a Border Collie with a dash of Wookie. Although he's now a beach boy, Chewie is very country at heart, being born on a Calaveras County farm. His love for open spaces hasn’t faded away, as he's grown to enjoy hikes, long walks on the beach, and most importantly, running as fast as possible after tennis balls. He enjoys submitting to any and every dog he meets followed by a playful game of "catch me if you can!"

Atlas is a Boston Terrier puppy whose mother was rescued from the Paradise, CA wildfire. He swapped his rural west-coast life for the big city as soon as he was old enough. Atlas lives in NYC where he regularly turns heads and melts hearts as he prances down the sidewalk. He loves getting to work and play with all the other dogs at Looker. He starts the day with a lap of the office, stopping by all the desks of all the Lookers who keep treats on their desks (he knows who the softies are), followed by a nap on his favorite conference room chair.

First things first, Walter is not a dog; he is a 90 year old man trapped in a quadrupedal body with the swagger and the muscular glutes of a 20-something frat star. Don't let his chill demeanor fool you though, the minute he sees a ball he's the fastest dog in the park — just don't expect to get it back without a fight and a fair amount of slobber. He's also become an expert at tricking his funcle into taking him to the dog park at all hours of the day and night. Walter is the best snuggler known to man and fully intends on sleeping under the covers with his head on the pillow like the human that he is.

Calais is a 3.5 year old Golden Retriever and Lab mix (mostly golden). Calais was originally meant to be a service dog for Canine Companions for Independence, or CCI. That didn't work out, so she moved to the big city and started her career in tech. She is the Chief Pawperating Officer of the Looker San Francisco office and loves coming to work because everyone gives her scratches and her dad gives her a lot of treats for being a good girl. She has 7 brothers but no sisters!

Learn More About Our Looker Culture

Take a look inside our culture here or follow us on social media at @lookerdata.

<![CDATA[Women of Data: Diana Streche, Data Engineer at Adore Me]]> http://looker.com/blog/women-of-data-diana-streche http://looker.com/blog/women-of-data-diana-streche Diana Streche is a Data Engineer at women’s intimates company Adore Me. Since graduating with a degree in Computer Science from the University of Transilvania, Diana has worked in roles including on the business intelligence team for a mobile gaming company.

Now at Adore Me, she shares her love for data by enabling users across levels of technical expertise to find success with data and build on their company data culture.

Hello Diana! Can you share with us how your background led you to a career in data?

I initially came from a purely technical background, where working with data was more of a hobby that I happened to have on the side. When I started looking for work, I managed to get into a big mobile gaming company working on the business intelligence team. I had a lot to learn from scratch as far as BI was concerned, but in the meantime I discovered that I liked working with data far more than I liked producing software. With all the opportunities out there in the data and business intelligence space, I decided I would continue and build on those skills, which has led me to where I am today.

What has been the biggest surprise in your career?

As an artist and creative at heart, I initially didn’t want to work in tech at all. When I started, I had absolutely no knowledge of what BI meant, how it worked, and my technical skills definitely did not match the job description. But with time, I discovered that there was more to business intelligence than met the eye. I ended up surprising myself when I learned that I had more skills to bring to the table through my creative side than I’d first thought.

What advice would you give to other women who are interested in pursuing a similar career path to yours?

Everyone has a niche to fill, so find what you’re good at and make it work for you. Being the lone female in my line of work, I often felt the need to “compete” with the guys and show that I was just as good at doing what everyone else was. I ended up deciding that instead of competing with the masses, I would continue to develop the skills I was good at and enjoyed, and would make those skills useful for everyone else. In doing so, I was able to be useful in areas that no one else was tackling and be appreciated and given new opportunities because of it.

What can women in the workplace do to help build the foundation for successful careers?

A personal mantra of mine is ‘always keep learning’. If the work you’re doing isn’t pushing you to learn and do more, consider making a change. Take on learning something new or find challenges and skills you can develop to help fill gaps where you are now. There is always something to do and room to grow.

"Everyone has a niche to fill, so find what you’re good at and make it work for you."

How do you think individuals can use data to advance their ideas or careers?

Data is very powerful — it permeates almost everything we do, sometimes without our even realizing it. It can come in the form of KPIs that help push your business in the right direction, or operational data that tells you the when, what, and where of your applications, warehouses, and traffic.

What I think is most valuable about data is that it’s a communication bridge between people across levels of a company. Data bridges differences in experience and backgrounds, even within teams, and ensures that everyone has a say. You may not understand the internal mechanics of other teams and departments, but you can understand the data, and that can help you drive your point across.

Do you think that data can help build a more diverse and equal workplace?

I think it already is! By getting people from all corners of the company to understand and interact with the same data, we are already giving people a say in decisions that they may not have gotten to participate in before. Only a limited group of people can challenge an opinion, but anyone can challenge the numbers. Through that, I think we’re opening up a completely new world.

<![CDATA[Data of Thrones: Game of Thrones Gender Analysis for Death, Sex, & Dialogue]]> http://looker.com/blog/data-of-thrones-v-gender-analysis-game-of-thrones http://looker.com/blog/data-of-thrones-v-gender-analysis-game-of-thrones Since the beginning of the series, much has already been noted about the “woman problem” in Game of Thrones, from how the show may lean into gender stereotypes, to killing off female characters in one fell swoop (or season).

Thanks to our analysis of GoT screentime, we were able to validate, through data, that GoT women got less total screentime than their male counterparts. Average screentime per female character was actually higher than per male character, but only because there are fewer speaking and leading roles for women than for men.

With female characters poised to steal the show in GoT’s final season — especially considering the most recent episodes — we decided to revisit our gender analysis with the added context of scene-level and script data we now have in #DataOfThrones.

Read on to learn what we found in our gender analysis dashboard of GoT script and scene-level data, but be warned: SPOILERS AHEAD!

Game of Thrones script analysis by gender

To level the playing field between male and female characters, we took the top 15 speakers of each gender, ranked by the number of lines each spoke. The result?

  • 30 characters
  • 155,407 words spoken
  • 10,671 lines
  • 3,181 minutes of screentime

Even though we already knew men get more screentime than women, the results still surprised us. When looking at the top 30 speakers by line count in Game of Thrones scripts, men speak 29% more lines than women.

But even more surprising is that some dead male characters have more lines than female characters who are still alive. By the end of season 6, Ned Stark still has more lines attributed to him than Yara and Ygritte combined.

No matter how we sliced it, male characters consistently spoke more than their female counterparts: 0.7 more words per line, 2 more words per scene, and 4.3 more words per minute.

Game of Thrones words and dialogue breakdown for male and female characters

So what were all those words that men were saying but women weren’t? And vice versa?

When we analyzed the words more likely spoken by a specific gender, we found some curious results.

Male characters tended to have more masculine-centered words in the analysis, like “men,” “man,” “king,” and “lord.” However, female speakers were likely to use proper nouns like “loras” and “joffrey,” which lines up closely with another popular word for female speakers: “husband.”

While we found “men,” “time,” and “come” in our male words analysis, the leading female words were “leave” and “please.” And it was quite a delight to discover that female speakers used the words “stupid,” “liar,” and “oysters” more than men.

Our favorite find here? The starkest (pun intended) difference is the appearance of “arya” in the female word list. We hope it’s a foreshadowing of what’s to come.
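For readers curious how a “words more likely spoken by one gender” comparison can be computed, here is a minimal sketch in Python. The data shape, the minimum-count cutoff, and the add-one smoothing are our own assumptions for illustration, not the methodology behind the dashboard above:

```python
from collections import Counter

def gendered_words(lines_by_gender, min_count=3):
    """Rank words by how much more often one gender speaks them.

    lines_by_gender: {"female": [line, ...], "male": [line, ...]}
    Returns {word: ratio}, where ratio > 1 means the word is
    female-weighted and ratio < 1 means it is male-weighted.
    """
    counts = {}
    for gender, lines in lines_by_gender.items():
        c = Counter()
        for line in lines:
            c.update(line.lower().split())
        counts[gender] = c
    totals = {g: sum(c.values()) for g, c in counts.items()}

    ratios = {}
    vocab = set(counts["female"]) | set(counts["male"])
    for word in vocab:
        f = counts["female"][word]
        m = counts["male"][word]
        if f + m < min_count:
            continue  # skip rare words: small counts give noisy ratios
        # Add-one smoothing so a word absent for one gender
        # doesn't produce a division by zero.
        ratios[word] = ((f + 1) / (totals["female"] + 1)) / (
            (m + 1) / (totals["male"] + 1)
        )
    return ratios
```

Sorting the resulting dictionary by ratio (descending for female-weighted words, ascending for male-weighted) produces word lists like the ones discussed above.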

Death, sex, & gender in Game of Thrones

Even if you don’t watch the show or read the books, you might have heard how GoT is a very sexy and violent show. We personally have still not gotten over the death of a certain dad in season 1.

Thanks to our scene-level data, we can now compare deaths to sex, and what kind of sex these characters are having. So what did the data tell us?

Turns out how you die in Game of Thrones is (kind of) gendered for the top 30 speakers. Male characters tend to die bloodier deaths (so much stabbing!), while female characters tend to die in more creative ways like poison, wildfire, and the Moon Door.

The “type of sex” data revealed just how medieval-inspired the story and the Game of Thrones world is. Not only were female characters more likely to be involved in romantic (love) sex, but they were also more likely to be in sex scenes that are categorized as rape. Male characters, on the other hand, are more likely to have paid sex, as well as fun and “served” sex. We’ll leave it to you to figure out who is having which type of sex.

An interesting find in this data: it only associates “incest” with a female character, even though we know it takes (at least) two to tango.

Explore Game of Thrones gender analysis

Want to dive in yourselves? Start exploring our Game of Thrones Gender Analysis dashboard. We can’t wait to see what you find!

A few notes on the analysis

  • We isolated 15 female and 15 male characters from Game of Thrones to only include those with the highest count of lines in the script data.
  • We excluded any speakers in the script data that we could not properly identify as a character.
    • For example, the speaker “HIGH” could be High Sparrow, High Septon, or High Priestess, so “HIGH” was excluded from the analysis
    • We used the label “Gender” instead of “Sex” for male & female characters so it’s not confused with the act of sex in this analysis
  • Characters included in the analysis
Female characters  | Male characters
Cersei Lannister   | Tyrion Lannister
Daenerys Targaryen | Jon Snow
Sansa Stark        | Jaime Lannister
Arya Stark         | Petyr Baelish
Brienne of Tarth   | Samwell Tarly
Catelyn Stark      | Davos Seaworth
Margaery Tyrell    | Theon Greyjoy
Melisandre         | Lord Varys
Olenna Tyrell      | Bronn
Missandei          | Jorah Mormont
Yara Greyjoy       | Tywin Lannister
Ygritte            | Eddard Stark
Shae               | Stannis Baratheon
Gilly              | Robb Stark
Lysa Arryn         | Sandor Clegane

Disclaimer: Game of Thrones belongs to HBO and is not affiliated with Looker in any way.

<![CDATA[The Podium - April 18th, 2019]]> http://looker.com/blog/the-podium-april-18 http://looker.com/blog/the-podium-april-18 Hey, Good Lookers!

I’m freshly back from JOIN London and still feeling the buzz from meeting so many awesome Looker users. While I’ve been gone, there’s been lots of great activity on the Community that I’m eager to share with you all. And bonus note — Office Hours is back next week! If you’re in San Francisco or New York City, swing by our offices on April 25th at 3:30pm local time to do some spring cleaning of your Looker instance! Sign up here for the event.

PS: It’s Game of Thrones season, which means it’s also #DataofThrones season! Check out our interactive Game of Thrones data and make sure to share your findings with the Community here.

With that, let’s hit it! Scroll down to read the latest edition of The Podium.

Izzy Miller
Community Manager, Looker

Questions and Answers

Displaying h:mm correctly across 24 hours

DGS wanted to add two timestamps together but kept getting results under 24:00 due to formatting. Paul_Wadsworth jumped in with the magic value format key that fixed it.
Looker Community Discussion — Displaying h:mm correctly across 24 hours
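For context, the usual culprit here is that plain `h:mm` formats roll over at 24 hours. One common fix for durations in Looker is the Excel-style elapsed-time format `[h]:mm`; the measure and field names below are hypothetical, and the linked discussion has the specifics:

```lookml
# Hypothetical measure summing a duration in hours
measure: total_duration_hours {
  type: sum
  sql: ${hours_worked} ;;
  # Square brackets around h switch to Excel-style elapsed-time
  # formatting, so 25.5 hours renders as 25:30 rather than 1:30.
  value_format: "[h]:mm"
}
```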

Making default parameter values dependent on Liquid logic

This is a tough one! Menashe wanted to dynamically set the default value of a parameter. He got a few suggestions from bens and IanT, ranging from the simple to the “wacky”. Read on —
Looker Community Discussion — Making default parameter values dependent on Liquid logic

Changing the color of one column in a chart

Jmay wanted to call out one specific column in a chart with a single color. She figured out a clever way to do it but was missing the last step, which powellandy shared. This is a neat trick that I use quite a bit myself — check it out here.
Looker Community Discussion — Changing the color of one column in a chart

Community Spotlight — Haley Baldwin

Our guest this week is @haleyb. Welcome, Haley!

Where are you from and where do you call home now?

I’m from Northern Virginia and still call it home!

What do you do with Looker?

I’m a Senior Data Visualization Consultant at Excella, currently using Looker to develop dashboards for one of our customers. We’re bringing together multiple data sources that were previously separate and time-consuming to search, giving our customer easy access to data, and making it easier for them to spot anomalies and possible fraud.

What do you do for fun (when you’re not using Looker)?

Baking and music! I love baking bread/desserts and I also drum for a cover band and a marching band in the DC area.

When you were a little kid, what did you want to be when you grew up?

As a kid, I was anxious about not having an answer to this question and arbitrarily picked photographer as my answer... so I’ll go with that.

What’s the coolest Looker project you’ve worked on?

I’m really enjoying the current project I’m on. It’s my second Looker-based project and I’ve learned a lot about setting up explores and joins, creating derived tables, and using HTML to customize our dashboards. Beyond that, my team is also doing some cool data science work with text mining and network analysis, so it’s a fun challenge to figure out the best way to visualize that data in Looker, possibly using custom visuals.

If you could ask anyone at Looker a question, who would you pick and what would you ask?

Haley: To the product development team — are there plans to continue supporting and improving LookML dashboards? We only develop LookML dashboards for our customer, but it seems like more focus goes to the user-defined dashboards (easier to track usage, supported on the Overview page, easier to organize in Spaces, etc.)

Arielle, Product Manager at Looker: Our ultimate plan is that our users shouldn’t have to worry about the difference between UDD and LookML dashboards. We’d like to combine the two concepts where you can build a WYSIWYG dashboard, but still have the version-controlled LookML backend, resulting in one dashboard concept used across the product so that it does show up in Spaces, etc. We don’t have any specific timelines for it yet, but it’s something we are researching.

Izzy, butting in: You should also check out the sync_lookml_dashboard beta API endpoint, which lets you keep a User Defined Dashboard synced to a LookML dashboard.

Community Knowledge Share

The Community knowledge share is on the technical side this week! Below are some super cool projects that are too interesting to not highlight — take a look!

Ro's LookML Autogeneration System

As Ro scaled up their model, they took the time to build out yet another layer on top of LookML, and now they manually write “less than 10% of the number of lines in the LookML that is generated”. Really ingenious stuff, thanks for sharing Sami_Yabroudi!
Looker Community Discussion — Ro's LookML Autogeneration System

Automating Looker user feedback collection

Josh_Temple from Milk Bar created an awesome script for automatically gathering Looker feedback from his business users. The philosophy behind that idea brings tears of joy to my eyes, and the implementation is the cherry on top. Keep crushing it, Josh.
Looker Community Discussion — Automating Looker user feedback collection

Totally accurate Game of Thrones predictions based on data

Looker’s very own Tom_Yeager, who has never watched Game of Thrones, used our #DataofThrones explores to come up with some educated guesses about the series. This is a must-read.
Looker Community Discussion — Totally accurate Game of Thrones predictions based on data

Join the Conversation

That’s all, folks! Head over to the Community to post your thoughts or to say hey to Haley. Also, be sure to share your comments about any specific discussions and tips that have helped you out lately. And of course, send in your questions for the next Spotlight feature.

<![CDATA[An Act of Empathy: Why I Joined Looker]]> http://looker.com/blog/an-act-of-empathy-why-i-joined-looker http://looker.com/blog/an-act-of-empathy-why-i-joined-looker I'm sitting in the audience at Looker's JOIN: The Tour event in New York City, when the speaker on stage shares a quote from the company's co-founder, Lloyd Tabb:

"Great software is an act of empathy."

The quote, which invites us to listen to and understand the people using our products, is as profound as it is surprising. One expects founders of high-tech startups to make grand statements about growth, innovation, and disruption — all lofty goals, to be sure — but empathy is an act that reveals emotion and humility. These aren't qualities typically associated with ambitious leaders of hot technology companies.

But at Looker, a data & analytics platform company I joined this month, this kind of thinking is part of the culture, ingrained in the values taught to all Looker employees (referred to affectionately as "Lookers") during new hire orientation. Values like loving our customers (not just making them successful), welcoming and embracing diversity, teaching others, acting with integrity and authenticity, checking your ego at the door, and finding time to unplug and nurture the mind, body, and soul.

Now, if all this sounds too good to be true, I don't blame you. As a 20+ year veteran of the data & analytics industry, I'm as jaded as they come. But I've also been around long enough to know when I've found something special, and this place is special.

And culture isn’t the only quality that makes Looker special. What ultimately convinced me to join was realizing we share the same core beliefs when it comes to data & analytics.

All people are data people

We share the belief that business intelligence and analytics software has primarily been designed for data-savvy analysts. Historically, for the “data-driven workforce” — those of us who need data to do our jobs but whose job is not data — these products haven’t worked. If we want to help companies become data-driven, we must design software for everyone, not just data analysts.

Data-driven experiences vs. dashboards

We also agree that great data & analytics software should reach us through the applications we already use. Business professionals today don't live in reports and dashboards — we live in business applications like Salesforce, Google Analytics, and collaboration tools like Slack. We must deliver "data-driven experiences" that seamlessly blend into and enhance our existing business workflows.

Governance is not a dirty word

I firmly believe that a business-friendly user experience is not at odds with well-governed data. Vendors of desktop discovery tools trivialize the "single version of the truth" while their customers suffer from unreliable data and inconsistent KPIs. Agile semantic layers are essential to empower companies with trusted data for more confident decisions, and this is a conviction Looker also shares.

Data for good

Finally, the belief that data can and should be used for a greater purpose is something I am especially encouraged to find at Looker. I was delighted to learn about Looker for Good, a way of giving back to our communities by pledging 1% of our product and our employee time to charity, and by offering significantly discounted prices to non-profit organizations who purchase our software.

Looker is my title

I am thrilled to join a growing company guided by fundamentally principled values, with a unique vision of what data & analytics should be, and a brilliant and talented team of Lookers committed to making customers happy and successful.

I am thrilled to have this extraordinary opportunity to help build something truly unique. To be a part of a company creating great software designed with empathy that delivers data-driven experiences to fit with the way people work is an adventure I’m excited to embark on.

I am thrilled to be a Looker.

<![CDATA[Data of Thrones: Create Your Own Interactive Game of Thrones Data Visualizations]]> http://looker.com/blog/create-your-own-game-of-thrones-data-visualizations http://looker.com/blog/create-your-own-game-of-thrones-data-visualizations The final season premiere of Game of Thrones is just a few days away! With winter fast approaching, we’re more excited than ever to see what the Data of Thrones will show us — especially if we can discover a new twist or trend before it’s revealed to the world during the final season.

In the spirit of the Brotherhood of the Night’s Watch, we want to invite all of you dear readers to the Data of Thrones! Starting today, you can explore all of the Game of Thrones data we’ve been using to learn how much more screen time male characters get, how deadly each episode truly is, and make some wild predictions that turned out to be about 50% right.

So, what are you waiting for? Read on to learn more about the datasets we have available for exploration, or head over now to Data of Thrones on the Community to start exploring.

As you explore, post your findings on Community and share them with @LookerData on Twitter using #dataofthrones!

Game of Thrones script data and analysis

Sometimes, the pen can be mightier than the sword. See who is saying what the most, and who gets the most say by digging into the scripts of Game of Thrones.

Explore here

Scene level details for Game of Thrones

Explore the series from a scene level and find insights about scene length, when characters marry, or where the most deaths took place.

Explore here

Game of Thrones episode data

Explore the Game of Thrones episode data and find out more about the seasons and episodes that were the most murderous — or the most romantic.

Explore here

Game of Thrones character data

Curious to know who’s gotten the most screentime over the entire season or which house is the most bloodthirsty? Find out the answers to these questions and more by exploring the character Game of Thrones data.

Explore here

Winter is Here

Start exploring The Data of Thrones. We can’t wait to see what you find!

Disclaimer: Game of Thrones belongs to HBO and is not affiliated with Looker in any way.

<![CDATA[Building on the Platform: Custom Applications for Sales & Marketing]]> http://looker.com/blog/custom-applications-for-sales-and-marketing http://looker.com/blog/custom-applications-for-sales-and-marketing Today’s successful organizations are moving beyond BI tools to create and deliver custom data experiences. They’re starting new ventures, delivering for their customers, and building data cultures to support rapid growth. As these experiences have become a crucial factor of success, so has the expectation that users can get the data they need to make critical business decisions, regardless of technical skill or specialty.

Here at Looker, we're on a mission to empower people through the smarter use of data. To do this, we continue to build, test, and evolve to ensure we're supporting companies throughout their data journeys. Building on our platform, we’re breaking the mold of traditional BI tools to deliver operational insights and provide experiences that are unique to the needs of users. A relatively new way we’re doing this is through pre-built, use case specific applications.

Why We Built Applications

We love working with all types of companies to help them understand the best ways to leverage their data and empower their users. While we’ve seen thousands of custom metrics and unique technical stacks, we’ve also noticed that many companies spend time defining and building nearly identical types of analyses.

To help both our analysts and business users self-serve, and to simplify these common analytical needs, we codified these metrics into out-of-the-box data models based on what we’ve learned from these use cases. By presenting analyses this way, it becomes easier for more people to make business decisions backed by data.

That is why we are building applications on top of the Looker platform.

What Are Applications?

Applications are customized data experiences for specific use cases. From within a Looker deployment, applications make it easy to get your stack up and running with quick time-to-value, no matter your level of technical expertise. Built to simplify self-service analytics, they remove the upfront effort for users with a common need because they are plug-and-play, purpose-built, and designed for scale.


Plug-and-play

Unlike implementing a whole new tool, applications are an extension of an existing Looker deployment. This enables user groups with specific data needs to set up and begin using applications — no technical skills or wait time required.


Purpose-built

Applications help to quickly surface the most valuable, actionable insights for common use case data. And just like what’s possible with Looker, you can add in your specific KPIs to customize metrics, reports, and functionality.

Designed for scale

Because they’re built on top of the Looker platform, applications can support growth while continuing to address complex, cross-functional questions, empowering continued data-discovery.

Newly Released — Sales Analytics Application

At our annual JOIN: The Tour London event, we announced our new Sales Analytics application. Customers who were previously using disparate sales analytics tools can now bring all their sales data together for a unified view in Looker.

Sales efficiency has one of the greatest impacts on an organization’s bottom line, yet sales management, representatives, and even operations specialists are not always data-driven roles. Sales professionals are often skilled communicators and hard-working, creative thinkers — imagine the possible results if they were also data-driven.

With our new Sales Analytics application, we aim to deliver actionable data to members of the sales team to empower them to prioritize leads and deals as well as to improve pipeline visibility and reporting.

What does it do?

This application provides pre-built dashboards and reports that leverage the best practices we’ve learned from years of helping companies do sales analyses. Users have a complete view of sales data, without the limitations of trying to report directly within a CRM.

For sales managers, this means you can monitor quota attainment throughout the quarter to see how far under — or above — your team is on track to perform, allowing you to make adjustments before the end of the quarter. With the custom visualizations and reports, you can view your entire forecasted pipeline, identify what sources are leading to the newest bookings, and can provide accurate updates on recently closed customers and revenue.

In addition, sales representatives can leverage personal dashboards to track their individual contributions towards team goals such as lifetime bookings, new customers signed, and percent to quota. The leaderboard dashboard can be used to identify top performers throughout the quarter, surfacing opportunities to share best practices.

In addition to empowering sales teams with data, we’re excited to continue our work helping marketers get more bang for their budgets.

Looker for Digital Marketing Application

With digital marketing spending expected to increase from 44% to 54% in the next five years, it’s more important than ever to make decisions with data. To address this and help our customers succeed, we’re excited to announce that the Looker for Digital Marketing application is now out of beta and generally available.

Marketers no longer need to waste time and budget sorting through spreadsheets for stale data. With the Digital Marketing application, marketers have a single spot to go to better understand campaign performance, conversion rates, and ad spend.

This application comes with customizable reports for analyzing ad performance from sources such as Google, Facebook, Bing, Pinterest, and LinkedIn. It also provides suggested optimizations to help you identify and double-down on top performing ads, or lower and stop bids on poor performers — directly from within the Looker application.

With the ability to immediately analyze and act on metrics, digital marketers can focus in on the methods, mediums, and messages that contribute to the most successful campaigns — rather than trying to sort through spreadsheets.

Looker for Web Analytics Application

Our Looker for Web Analytics application is also now generally available for marketers aiming to optimize their website. For companies with multiple websites, this application equips your web analytics team with the necessary reporting functions and dashboards to stay on top of all your website metrics.

With the Web Analytics application, users can analyze and compare traffic for multiple domains, understand user-level detail, and optimize the entire web experience.

If you’re a web analyst or optimization specialist, using this application gives you the familiarity of what you know within a customized experience. Built with the reports you already rely on from Google Analytics 360, it allows you to perform cross-property analysis down to the user level. Drilling into these metrics within the application gives you the ability to better understand your unique or most frequent visitors across domains, locations, and campaigns.

Learn More

Whether you’re already a customer or you’re brand new to Looker, we’d love to hear from you. Learn more about our Web and Digital Marketing applications or reach out to get more information regarding our Sales Analytics beta program.

<![CDATA[Data of Thrones Part IV: Were our Season 7 Game of Thrones Predictions Right?]]> http://looker.com/blog/data-of-thrones-part-iv http://looker.com/blog/data-of-thrones-part-iv If you’re a Game of Thrones fan, you’re likely as excited as we are about the upcoming eighth and final season of the series. There are so many questions we still have that we’re eager to (hopefully) get answered!

As we anxiously await the results of winter’s arrival, we decided to pull a Viserion and resurrect our predictions from 2017. With an updated Game of Thrones data set and model, we went back to find out how our data-based, totally unscientific predictions for Season 7 of GoT fared in the #dataofthrones.


Prediction 1: Game of Thrones: Season 7, Episode 7 will have a lot of deaths.

Result: We were WRONG.

In our previous analysis, we saw a death-related trend in the data and predicted it would continue, specifically hypothesizing that episode 7 would be the deadliest of season 7.

To our surprise, season 7 kicked the death trend we’d previously seen to the curb. According to the all-knowing #dataofthrones, the highest number of named character deaths in the penultimate season actually occurred in the episode before the season finale. And when you think about it, that makes sense, since season 7 was largely setting the stage for what’s to come in season 8.

With that knowledge paired with the data, the deadliest episode of the series is likely to come in the final season and will break the record high of 9 named character deaths in 1 episode, which occurred in Season 6, Episode 10.

Prediction 2: Episode 4 is going to have a surprise death

Result: We were WRONG.

Although mid-season episodes were historically pretty deadly, Season 7, Episode 4 had no named character deaths. Instead, Season 7 had consistent, surprising deaths, the majority of which occurred during Episode 6.

Prediction 3: This season will not be as deadly as the final one

Result: We are PROBABLY RIGHT.

In our previous analysis, we found a trend of deadly seasons each followed by a less deadly one. As shown in the data, season 7 ended up having the fewest deaths of any season in the series to date, which leads us to believe Season 8 will not only have more deaths than Season 7 — it may very well be the deadliest of them all.

Prediction 4: Eddison Tollett is going to do... something

Result: We are PROBABLY RIGHT.

While Eddison Tollett remains a character who’s shown up in a surprisingly high number of episodes, he didn’t crop up as much as we expected in season 7.

But hold on! When including Season 7 in our analysis of total screen time throughout the entire series, our trusty Lord Commander of the Night’s Watch still ranks high in screen time — higher than Podrick Payne, Hodor, and Gregor Clegane. That’s pretty significant, wouldn’t you say?

We’re still unsure as to what exactly he’s out to do, but considering everything that happened in season 7, we have a feeling it’ll have to do with the breach of the wall and the good of the realm — in other words, some very worthy screen time.

Bonus Insight:

When we mapped screen time over season as a trend, we noticed that Eddison’s screen time trend has diverged from Jon Snow’s and is now trending similarly to some other key characters: Bran and Meera.

In addition, Jon’s new screen time trend is now similar to that of The Night King. No surprise there, since we know the face-off between these two factions is already upon us and can be expected to reach its climax in Season 8.

Prediction 5: Will Jon Snow survive with Sansa by his side?

Result: We were PROBABLY WRONG.

We previously noted that Jon and Sansa were the only two characters in the main character group whose screen time had increased consistently in recent seasons. However, we were wrong to interpret that trend to mean their shared storylines would keep their screen time growing in season 7.

While Sansa’s triumphant return to Winterfell coincided with Jon Snow earning his highest amount of screen time within a season to date, that was likely because her return allowed Jon to leave his childhood home and recruit support for his campaign north of The Wall.

More interestingly, screen time for nearly all of our major characters reached series peaks in Season 7 — except for Sansa. With the action moving from Winterfell to the rest of Westeros and heading to The North, it makes sense that those who remained in Winterfell saw less on-screen action than the rest of the characters.

Prediction 6: Varys’ big plot will be revealed

Result: We were MOSTLY RIGHT.

While a big plot reveal wasn’t quite what we got from Varys’ appearances in season 7, we learned some interesting things from him that led us to believe we were mostly right about this prediction.

When Daenerys confronts him about his poor show of loyalty in the past, Varys responds, “You wish to know where my true loyalties lie? Not with any king or queen, but with the people.” It’s clear, now more than ever, that he has never been out to serve a single ruler, but rather chooses to serve the realm. He goes on to say, “I will dedicate myself to seeing you on the Iron Throne, because I choose you. Because I know the people have no better chance than you.”

So although a massive plot wasn’t necessarily revealed, we gained a bit more understanding into his rationale and motivations.

Going back to the data, we plotted some specific words Varys has spoken throughout the series against each other to see whether the data backs this up. The percentages of times Varys mentions “Daenerys” (pink), “Cersei” (blue), and “throne” (yellow) are trending most closely to one another in Season 7 — perhaps hinting at the inevitable face-off between the Mother of Dragons and the former Queen Mother.

Prediction 7: This will be the end of Little Finger… if we see more of him

Result: We were RIGHT... but also kind of wrong.

We all collectively knew Littlefinger’s time was ticking; we just didn’t know with certainty when his end would come. In one of the most anticipated (though arguably least climactic) moments of season 7, Littlefinger was executed at the hands of Sansa and Arya Stark.

This confirms that our initial hypothesis was true: screen time is related to whether a character will die. Littlefinger’s screen time in season 7 was the third-highest of his run, right up until he was killed.

However, screen time has not proven to be the loudest signal for death. In fact, most major characters had upward-trending screen time regardless of the likelihood of their demise, as shown here when we compare the screen time of Varys (alive) against Petyr (recently deceased).

Final Tally of Game of Thrones Season 7 Predictions

Right: 2.5; Wrong: 2.5

Half and half — not too bad! Some of our predictions were closer than others, and some left us still curious and wondering. Are there other trends in the data that will ultimately contribute to the events of the final season?

Subscribe to our blog for more data stories, or share your Game of Thrones insights and predictions with @LookerData!

Disclaimer: Game of Thrones belongs to HBO and is not affiliated with Looker in any way.

<![CDATA[The Podium - April 1st, 2019]]> http://looker.com/blog/the-podium-april-1 http://looker.com/blog/the-podium-april-1 Hello Looker Community!

Sad news today — The data has spoken and since it looks like nobody’s reading these, this will be the last ever edition of The Podium :(

dimension: what_day_is_it {
    type: string
    sql: CASE WHEN ${TABLE}.date = '2019-04-01' THEN 'APRIL FOOLS DAY' END ;;
}

Kidding! The data has spoken and it looks like it’s APRIL FOOLS DAY! Happy April, Good Lookers.

Let’s jump right into this issue of The Podium, our biweekly blog highlighting the best of the Looker Community. Needless to say, it’s here to stay!

Izzy Miller
Community Manager, Looker

Questions and Answers

Two WITHs in a PDT

menashe had a question about using PDTs with CTEs (all the acronyms) that reference other non-persisted derived tables. Conrad, our resident PDT genius, hopped in to confirm the correct answer.
Looker Community Discussion — Two WITHs

Write back data to Looker from R

dsml had an interesting question regarding piping data from R back into Looker. Click through for two answers from the community.
Looker Community Discussion — Write back data to Looker from R

Community Spotlight — Rich Eden

Our guest this week is @rich1000. Welcome, Rich!

What do you do and where do you do it?

I'm Head of Data at Zava, an online doctor based in London but operating throughout Europe — by many measures we're the biggest :)

What’s the last table calc you wrote? What does it do?

Oh, let me think, probably my favourite 'running total divided by column total' combination. Really useful to identify the site pages responsible for 80% of traffic and share with other people in the company. But the thing we really love in the data team is actually custom dimensions — they are a game-changer. Now you can quickly group things together and pivot a whole table by your new grouping to create amazing charts, and all without creating a LookML mess for things that might only be used a handful of times. Very cool stuff — you should move it out of beta as soon as possible!
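For readers who want to try Rich’s favourite combination themselves, a “running total divided by column total” table calculation might look something like this sketch — the `pages.pageview_count` measure is a hypothetical field name, not from any real model:

```lookml
# Table calculation: cumulative share of total traffic.
# With rows sorted by pageviews descending, scan down until this
# hits 0.8 to find the pages responsible for 80% of traffic.
running_total(${pages.pageview_count}) / sum(${pages.pageview_count})
```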

Which body part would you least mind losing? Why?

My hair, oh too late, it's already gone :(

What’s the most fun dataset you’ve ever played with?

I used to be responsible for some of the government's migration statistics a long time ago. Perhaps not the exciting cutting edge of data science but it's amazing — and quite scary — to see people arguing about your analysis in national newspapers!

What Looker feature are you a total expert on? Could you share a secret tip/trick?

I'm not sure I'm pushing the boundaries with any particular feature, but I do like to keep the Looker model nice and simple, unsentimentally deleting anything that looks like clutter. My rule here is if you find the LookML confusing then it's probably a bad idea.

Do you have a nerdy data/SQL joke?

As I frequently tell the team the three hardest things in data analysis are naming things and off by one errors. (I really do tell them this, sometimes they are polite enough to laugh.)

What’s a question that you’d like to pose to other community members?

I've had all my questions answered already on Discourse but I'd love a way to actually review the Looker models built by other companies for inspiration. We're all sitting in isolated teams solving the same problems and, I suspect, coming up with radically different solutions. I'm sure there are some great ideas out there that I don't even think to ask about. Unknown unknowns if you will.

Community Knowledge Share

Filtering & Ordering Only the Visualization

This is technically a question and answer, but so much knowledge was shared by so many members that I thought it deserved a place in this category. It was a party with Dawid, Ian, Jonathan, Andy, and me (though I was wrong!), all working towards the solution.
Looker Community Discussion — Filtering or Ordering Only the Visualization

Join the Conversation

That’s it for this edition! Head over to the Community to post your thoughts or to say hey to Rich and tell him about your Looker model! Also, be sure to share your comments about any specific discussions and tips that have helped you out lately. And of course, send in your questions for the next Spotlight feature.

<![CDATA[Top 4 Data Strategy Trends for 2019]]> http://looker.com/blog/top-four-data-strategy-trends-for-2019 http://looker.com/blog/top-four-data-strategy-trends-for-2019 Recent research from Gartner1 found that 91 percent of organizations across the globe have not yet reached a "transformational" level of maturity in data and analytics, despite it being a number one investment priority for CIOs in recent years.

Large enterprises are typically reliant on legacy software and systems alongside best-of-breed cloud applications. Gartner2 reports that on-premise deployments still dominate globally, ranging from 43 to 51 percent depending on the use case, while hybrid environments make up between 26 and 32 percent. It is these hybrid environments — with their potential for data silos — that have held many organizations back and prevented them from benefiting from the wealth of information at their disposal.

With this challenge in mind, many enterprises have only scratched the surface when it comes to the role data can play in driving business development. However, as we go deeper into 2019, we will see many large organizations close the gap on smaller players and evolve in terms of how they’re harnessing data.

I anticipate that we’ll see four key pillars as part of this data strategy framework change:

1. The role of the chief data officer (CDO) will become more prominent

A recent Forrester3 study found that 51 percent of organizations had appointed a CDO, with another 18 percent planning to do so. Additionally, two-thirds of existing CDOs were appointed in just the past two years, indicating a significant rise in the requirement for data leaders.

With CIOs, CTOs and CSOs increasingly stretched in their roles as part of the wider digital transformation and security challenges, the CDO will be tasked with defining a business’ global data strategy. The increased importance placed upon this role within organizations is a clear indication that large businesses are placing more emphasis on data as the architect and enabler of change. Ensuring they have the right platform in place will be key to delivering on this vision.

2. Reducing data silos will become more pervasive

There are already concerted efforts to bridge organizational data silos, placing an onus on centralizing data in order to achieve the ‘Holy Grail’ of one source of truth across an organization. While this has long been — and will continue to be — an industry challenge, reducing the silos will enable companies to unlock access to insights, improve customer experiences, and maintain data protection regulatory compliance.

3. Firms will begin to monetize data more effectively

Leveraging the data they already analyze and/or manage, enterprises will unlock new opportunities for their existing businesses. This will enable them to develop new business models and drive new revenue streams.

Urban Airship, which helps some of the world’s best-known brands leverage mobile, has worked with Looker to empower even moderately data-savvy marketers to run queries, explore new ways to look at data, and answer tough questions. This has resulted in its clients being able to react to trends faster, experiment with new ways of reaching customers, and to quickly evaluate and improve on efforts — reducing the friction between insight and action.

4. Data strategies will prioritize GDPR compliance

Last year, IT leaders were rightly committed to ensuring their organizations were compliant with GDPR. The typical drivers were eliminating the risk of significant fines and avoiding the reputational damage associated with being publicly called out by a regulator as non-compliant.

Now that more businesses are compliant with this legislation (or working towards it), leaders can look to use their more streamlined data sets to improve processes and glean insights into their employees, customer base, and other key stakeholders in order to improve decision-making.

That said, the work isn’t done yet. Businesses must ensure they remain GDPR compliant in the way they work today and moving forward. This means the development of a long-term data governance and analysis strategy, in which analysts can still provide their organization with game-changing business insights, while maintaining compliance with the regulation.

The Modern Approach to Data-Led Business

Easier processes, a clearly defined data strategy, and new business openings — that is the modern approach to analytics a data-led business should consider. I predict we’ll see more of this as we move deeper into the year, and those that embrace such a culture will steal a march on their competitors, gather market share, and unlock more growth opportunities.

1 Gartner Survey Analysis: Traditional Approaches Dominate Data and Analytics Initiatives, Nick Heudecker & Jim Hare, October 13 2017
2 Gartner Magic Quadrant for Analytics and Business Intelligence Platforms, Cindi Howson, James Richardson, Rita Sallam, Austin Kronz, February 11, 2019
3 Forrester: Insights-Driven Businesses Appoint Data Leadership, Jennifer Belissent, Ph.D., Gene Leganza, Elizabeth Cullen, Jun Lee, March 19, 2018

<![CDATA[Going Beyond Good Intentions: Ways To Advance Diversity & Inclusion In The Workplace]]> http://looker.com/blog/ways-to-advance-diversity-and-inclusion-in-workplace http://looker.com/blog/ways-to-advance-diversity-and-inclusion-in-workplace In the last three decades, organizations have started their journeys towards becoming more diverse and inclusive (D&I). Throughout these journeys, jargon such as diversity, inclusion, belonging, equity, and equality have shown up on company websites, in mission and vision statements, and in remarks made by CEOs or presidents during company meetings.

While this increase in vocal support and commitment to D&I is a step in the right direction, there is still room to improve upon the follow-up actions necessary for success. Good intentions are great, but alone they are rarely - if ever - enough to make a measurable impact.

The Data On Diversity In The Workplace

According to Catalyst, while women make up about 45% of total employees in S&P 500 companies, they are only 11% of top earners and hold only 4.8% of CEO roles. And that number has been decreasing since a high of 6.4% female CEOs in 2017.

In the C-suite as a whole, racial diversity still has room to grow. As of 2018, Fortune 500 companies counted only three African American CEOs and 11 Latinx/Hispanic CEOs among their ranks. Going further into the data on diverse leadership, the Bureau of Labor Statistics reported that in 2018, only 39.8% of management roles were held by women and 23.8% were held by people of color.

These statistics point to what is known as the glass ceiling, whereby underrepresented groups such as women and people of color cannot seem to advance past a certain point in their careers.

Actionable Steps To Advance D&I In The Workplace

While there isn’t a one-size-fits-all approach to tackling the issue of diversity and inclusion, there are key steps every organization should be taking when striving to match verbal commitment with measurable action.

1. Set Goals And Plan For How To Meet Them

In my experience, people need something tangible they can reach for. Simply saying your organization is going to hire more women, people of color, veterans, and other underrepresented identities won’t help reach the goal.

Leaders and advocates of DEI in the workplace should start by creating a clear picture of what success looks like. From there, break the ultimate goal down into the actionable steps that contribute to it. Every team, hiring manager, and leader in the organization should be expected to do what is within their locus of control to advance diversity and inclusion initiatives. Considering the necessary actions and holding everyone accountable for their part is key to making the goal a reality.

2. Build Relationships And Add To The Pipeline

Many times, if an organization lacks certain diversity, it’s because those who are recruiting, hiring, and making referrals lack that type of diversity in their own networks. Because of this, organizations need to be proactive with reaching out and making the first move to build trusting relationships with underrepresented groups. Generations of systemic barriers have kept these groups from accessing certain industries and positions, which has continued to impact their representation in the workplace today. To push past this and add to the pipeline, organizations must earn the trust of these groups and show potential employees that their experiences and identities are valued with actions, not just words.

3. Develop An External D&I Footprint

When looking for employment opportunities, many underrepresented populations look for organizations that they feel are in alignment with their own beliefs - or at the very least - make space for multiple beliefs and lived experiences to exist at the same time. This is why an external D&I footprint is so important.

An external D&I footprint is how a company talks about and creates spaces for D&I issues to be discussed within the industry. Whether this is done through blog posts, events, marketing campaigns, or company-wide memos, prospective employees from underrepresented backgrounds want to know that organizations have an opinion and are willing to express that opinion publicly.

4. Host Networking Events In Your Office

Opening up your office for events is a great way to help prospective employees visualize working for your company. Events give hiring managers the chance to meet with potential candidates in person and learn more about them beyond what is presented on a resume or cover letter. In addition, networking events provide a great opportunity to introduce these individuals to other people in the organization who could end up being their colleagues, building trust and relationships that emails or phone calls may not have been able to create.

Find Success Through Action

By taking actions that drive the measurable success of D&I initiatives, advancements in diversity, equity, and inclusion in the workplace won’t only be heard, but will be seen and reflected in the data.

If you want to learn more about the actions we’re taking at Looker, check out our dedicated DEI page, Belonging, for more information about diversity, equity, and inclusion at Looker.

<![CDATA[Deepening Code Reusability with LookML Project Import]]> http://looker.com/blog/deepening-code-reusability-with-lookml-project-import http://looker.com/blog/deepening-code-reusability-with-lookml-project-import Each month, Looker releases new updates and features that further enable the smarter use of data. As we continue to improve and build upon Looker, we want to highlight and share notable features so that our customers can take full advantage of them.

With Looker 6.8 come many great additions focused on modeling. Most notably, we are releasing beta support for importing projects from private, remote LookML repositories.

To understand why this is a big deal, let’s first review why LookML is so valuable to data modeling.

D.R.Y. Data Modeling with LookML

One of the core benefits of LookML is that it stops you from repeating yourself when doing data analysis.

When writing SQL, you often look back at queries you’ve written in the past, copy and paste little snippets of them, and reassemble those snippets to form a new query. This process is error-prone and doesn’t scale well. With LookML, you write down those snippets just once, as universal, reusable query components that are easier to manage. In computer science, this concept is called D.R.Y., or Don’t Repeat Yourself, and it can save you a lot of time (and tears).

Once the model is written, Looker is able to read it and present it in a visual, point-and-click interface that’s accessible to non-coders. As the user builds their query, Looker does the tiresome work of assembling optimized SQL code automatically behind the scenes.

The mantra is to write the code ahead of time, not every time.
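As a minimal sketch of the idea — the table and field names here are hypothetical, not from any real model — a revenue calculation written once in LookML might look like:

```lookml
# orders.view.lkml -- a hypothetical view file
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: status {
    type: string
    sql: ${TABLE}.status ;;
  }

  # Written once, this measure backs every query, filter,
  # and dashboard tile that needs revenue -- no copy-pasting.
  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
    value_format_name: usd
  }
}
```

When a business user picks `total_revenue` in an Explore, Looker assembles the corresponding `SUM` in the generated SQL automatically.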

The Issue: Sharing Code Between Projects

LookML models are organized as projects. Projects are the highest-level organizational containers for LookML, and until recently it was not possible to share code between them. This inability to easily share code between projects can create issues in common use cases where different departments would benefit from sharing the same code.

For example, say the finance department has a project that describes and calculates net revenue. The marketing department might also want to leverage this measure to determine the profitability of marketing campaigns. If they happen to be using different LookML projects, the only way to reuse the code would be to copy and paste it, which would take time and be subject to errors - not D.R.Y.!

The Solution

Fortunately, problems like these have already been solved in the greater world of programming languages. The solution is called package management, and it is a feature of almost all mature programming languages.

A package is a unit of code (or software) that can be reused by another program. The package manager is the system that allows packages to be found, installed, configured, updated, and removed. With a package manager, the developer doesn’t have to do these tasks manually, thereby eliminating work and improving code consistency.

To apply these principles to LookML, we built project import. Project import allows the developer to reuse code from another project without having to copy and paste it, thus deepening our commitment to D.R.Y. coding and saving LookML developers valuable time. We first introduced project import as a beta over a year ago, and have since made many iterative improvements.

Project Import for LookML Blocks

The first, most obvious benefit of project import is in using Looker Blocks. Looker Blocks are reusable units of LookML code based on common data sets and use cases. For example, we have a Salesforce Block that quickly adapts the notoriously tricky Salesforce schema and models it into dimensions and measures that everyone can understand. For anyone who has tried analyzing raw Salesforce data or has attempted to use its API, this is pure magic.

Project import improves the Blocks experience by allowing the developer to simply reference the URL of the Block’s source code and then immediately use that code inside their project. Additionally, because the Block’s contents aren’t hard-coded, developers can update their installation of the Block very easily; as the Block continues to get better, so does the developer’s model.

Project Import for the Enterprise: The “Hub and Spokes” Architecture

In large enterprises, the deployment of Looker can grow to a size where the core data team can no longer manage every department’s data needs, and some level of delegation must happen. Splitting up projects so that each department can manage their own becomes advantageous, but how do you maintain consistency?

In this case, you can use project import to push centralized, highly governed business logic down into each department’s project. We call this the “hub and spokes” architecture.

In the hub and spokes architecture, a central data team manages a core project (the “hub”), while each department manages its own projects (the “spokes”). The hub contains universal business logic, whereas the spokes contain domain-specific logic. With the latest release of Looker 6.8, hub logic is imported into the spokes via project import.
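As a rough sketch of what this looks like in practice — the project names, repository URL, and file path below are hypothetical — a spoke declares its dependency on the hub in its manifest, then references hub files using the `//project_name` path prefix:

```lookml
# manifest.lkml in the marketing "spoke" project
project_name: "marketing"

# Pull in the central "hub" project from its private remote repository
remote_dependency: core_hub {
  url: "https://github.com/acme/looker-core-hub"
  ref: "master"
}

# Then, in a model or view file in the spoke, hub code is
# referenced with the // prefix, e.g.:
# include: "//core_hub/net_revenue.view.lkml"
```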

With the hub and spokes architecture and project imports, you get the best of both worlds. Hub and spokes allows everyone to run at their own pace with distributed, scalable modeling ownership, while project import ensures everyone is still speaking the same language of shared logic, derived from organization experts.

Learn More

Ready to learn more and get started with project import? Check out our documentation, join the conversation about Looker 6.8 on the Community, and subscribe to our blog for future updates highlighting features coming with new releases.

<![CDATA[The Podium - March 20th, 2019]]> http://looker.com/blog/the-podium-march-20 http://looker.com/blog/the-podium-march-20 Hey Looker Community!

I’ve missed you all in the time since the last Podium. I was so glad to see smiling faces from the Community in person at JOIN: The Tour, Los Angeles. There’s something particularly great about putting faces to usernames, and I look forward to meeting more of you at future events.

Speaking of events to come, tomorrow — Thursday, March 21st — Looker is hosting an Office Hours event focused on ‘Building Impactful Dashboards’ in both our New York and San Francisco offices. From 3:30pm-5:00pm local time, there will be presentations, snacks, and great company and conversation. Learn more and sign up here, and if you go to the San Francisco event, I’ll see you there!

With that, let’s dig in and see what’s new in the Community with this week’s edition of The Podium.

Izzy Miller
Community Manager, Looker

Questions and Answers

Running total of distinct values

AndrewDoesData was working to create a running total of distinct values, using table calculations. Our beloved Fabio, also known as LookerIO, jumped in to provide a table calc method for him, as well as a derived table approach.
Looker Community Discussion - Running Total of Distinct Values

Columns stacking together

Paul_Wadsworth was having some trouble getting a waterfall chart to display correctly. Bens jumped in and provided some wisdom that illuminated the pathway to the solution.
Looker Community Discussion - Columns Stacking Together in Chart

Community Spotlight - Tim Nebesar

Our guest this week is @tnebesar. Welcome, Tim!

What do you do and where do you do it?

I’m on an analytics team for a large global retail company. We try to make our data understandable and useful.

How does Looker fit into your day-to-day?

I’ve been using Looker a lot recently. We have a lot of people very eager to get into the weeds of our data who don’t necessarily have the technical skills to query it themselves. Looker is great for democratizing and standardizing data. The more that we can build and socialize in Looker, the fewer questions and requests for ad hoc queries/analyses we get, so it’s a win/win.

What’s your secret superpower?

Hmm I don’t think I have a superpower — more of a jack of all trades. I do make a nice leather sunglasses strap among other hand-crafted goods though when not wrangling data. Check out timsxtim.etsy.com and use LOOKER15 for 15% off (shameless plug).

If you could open a restaurant, what would its gimmick be?

I’ve actually had this idea for a while. My go-to breakfast is scrambled eggs mixed with basically whatever I have in my fridge, e.g. cheese, peppers, onions, sausage, fresh salsa and sour cream. I’d call my restaurant “Dirty Eggs,” and it would be a fast casual restaurant where people can choose different ingredients to add to their scrambled eggs (no omelettes allowed). I have no interest in owning a restaurant though, so someone else should do this.

What’s your favorite word to use to describe data? Mine is “overcooked”.

Clean…it’s too rare.

Community Knowledge Share

Hiding all fields in legend

nicholaswongsg shared a super efficient way to deselect every label from the chart legend. I actually didn’t know about this shortcut, so thanks Nicholas for sharing your knowledge with the Community — I learned something too!
Looker Community Discussion - Hiding All Fields in Legend

Simple month-to-date forecasting

Looker’s own Marcus_O_Hanlon came up with an elegant table calculation to forecast a KPI on a month-to-date basis. After nmorrison’s initial refinement to calculate year-over-year, Marcus_O_Hanlon came back with an awesome reusable LookML pattern to do this.
Looker Community Discussion - Simple month-to-date Forecasting

Powered By Looker flowchart

I’ll let fabio’s words speak for themselves:
“With the huge amount of flexibility offered by Looker in terms of serving analytics to your customers, it can often be daunting finding the right documentation that applies to your situation! That’s why I’ve put together this flowchart to guide you.” This is awesome stuff, Fabio!
Looker Community Discussion - Flowchart for configuring Looker to your PBL tenancy situation

Join the Conversation

Head on over to Discourse to join the conversation and post your thoughts about this week’s edition. Share some kudos with Tim and ask him more about sunglass straps, eggs, or clean data. Also, be sure to share your comments about any specific discussions and tips that have helped you out lately. And of course, be sure to send in your questions for the next Spotlight feature.