Hi there! You're receiving this email because you signed up to Behavioral Insights, a weekly newsletter by Kevin Anderson at the intersection of experimentation, analytics and behavioral science. Thank you for being here.

Behavioral Insights.

👋 Hi! In this week's Behavioral Insights newsletter, 8 articles I found worth sharing:
1. Tools for Optimizing Digital Experiences 🙏🏻
With the launch of Amplitude Experiment, we have yet another vendor extending its offering in the space of ‘optimizing digital experiences’. I wrote a short post on what I think (and hope) this trend will mean for different teams within companies. Read my blog post

2. The Crisis of Optimisation 🔥
A good article by Oliver Palmer: "If the transparent, peer reviewed world of academia is fraught with this crisis of integrity, then what can we expect from the vastly less scrutinized corporate ‘science’ of A/B testing?" Read the article

3. Snowball Effect ❄️
Why does experimentation become part of the DNA at some organizations but remain an occasional tactic at others? You need to get complex contagion to work. Read the article

4. Increasing experimentation accuracy and speed by using control variates 💘
Etsy recently introduced the CUPED method to estimate treatment effects in A/B tests more accurately. In this post, they share some lessons on how they did it and what the results were. Read the article
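If CUPED is new to you, here is a minimal sketch of the core idea (my own toy illustration, not Etsy's code): a correlated pre-experiment metric is used as a control variate, so the adjusted metric keeps the same mean but has lower variance, which shrinks confidence intervals for the same sample size.

```python
import numpy as np

def cuped_adjust(y, x):
    """CUPED adjustment: use pre-experiment covariate x as a control variate for metric y."""
    theta = np.cov(x, y)[0, 1] / np.var(x, ddof=1)    # OLS coefficient of y on x
    return y - theta * (x - np.mean(x))               # same mean, lower variance

# Hypothetical per-user data: the pre-period metric correlates with the in-experiment metric
rng = np.random.default_rng(42)
x = rng.normal(100, 20, size=10_000)                  # pre-experiment metric
y = 0.8 * x + rng.normal(0, 10, size=10_000)          # in-experiment metric
y_adj = cuped_adjust(y, x)
print(np.var(y), np.var(y_adj))                       # variance drops substantially
```

The treatment effect is then estimated as the difference in means of the adjusted metric between treatment and control, exactly as in a plain A/B comparison.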

5. Are you Bayesian or Frequentist? 🆚
Cassie Kozyrkov explains the differences between Bayesian and frequentist statistics, and when you should adopt which method. Read the article
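To make the distinction concrete, here is a toy A/B example of my own (not from the article): the same conversion data answered once with a frequentist p-value and once with a Bayesian posterior probability.

```python
import numpy as np
from scipy import stats

# Hypothetical data: conversions out of visitors per variant
conv_a, n_a = 120, 1000
conv_b, n_b = 150, 1000

# Frequentist: treat the rates as fixed unknowns and compute a p-value
# for the observed difference under the null of equal rates.
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = np.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (conv_b / n_b - conv_a / n_a) / se
p_value = 2 * (1 - stats.norm.cdf(abs(z)))

# Bayesian: treat the rates as random, update a Beta(1, 1) prior, and ask
# how probable it is that B beats A.
post_a = stats.beta(1 + conv_a, 1 + n_a - conv_a)
post_b = stats.beta(1 + conv_b, 1 + n_b - conv_b)
prob_b_beats_a = np.mean(post_b.rvs(100_000) > post_a.rvs(100_000))

print(f"p-value: {p_value:.3f}, P(B > A): {prob_b_beats_a:.3f}")
```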

6. How eBay Reimagined Its Analytics Landscape 👀
Learn how eBay transitioned its analytics data platform from a vendor-based data warehouse to an open-source-based solution built by the team. Read the article

7. Split is looking for an Experimentation Advocate 💬
Check out the job profile

8. Minerva — Airbnb’s Metric Platform 📈
In an earlier newsletter I shared the first blog post on Airbnb's metric platform. In the second post, the Airbnb data team shares the design principles behind Minerva (I've added a small, hypothetical sketch of the declarative idea below the list):
  • Standardized: Data is defined unambiguously in a single place. Anyone can look up definitions without confusion.
  • Declarative: Users define the “what” and not the “how”. The processes by which the metrics are calculated, stored, or served are entirely abstracted away from end users.
  • Scalable: Minerva must be both computationally and operationally scalable.
  • Consistent: Data is always consistent. If definition or business logic is changed, backfills occur automatically and data remains up-to-date.
  • Highly available: Existing datasets are replaced by new datasets with zero downtime and minimal interruption to data consumption.
  • Well tested: Users can prototype and validate their changes extensively well before they are merged into production.
Read the article
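To give the ‘declarative’ principle some texture, here is a heavily simplified, hypothetical sketch (my own illustration; Minerva's actual interface is different): the user declares what the metric is, and the platform owns how it is computed, stored, and served.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDefinition:
    """Hypothetical declarative metric definition, in the spirit of a metric platform."""
    name: str                    # canonical, unambiguous metric name
    source_table: str            # event table the metric is derived from
    aggregation: str             # e.g. "sum", "count_distinct"
    value_column: str            # column to aggregate
    dimensions: tuple[str, ...]  # dimensions the metric can be sliced by

nights_booked = MetricDefinition(
    name="nights_booked",
    source_table="fct_bookings",
    aggregation="sum",
    value_column="nights",
    dimensions=("region", "listing_type"),
)
# How (and where) the metric is computed, stored, backfilled, and served
# is left entirely to the platform; users only declare the definition.
```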

😀 Fun of the week
Imagine you are in a room with 100 strangers, and imagine they're similar to your peers and neighbours. Based solely on your own instincts, perceptions, and self-reflection, answer the questions on thanaverage.xyz

As always, if you're enjoying this newsletter, I'd love it if you shared it with a friend or two. You can send them here to sign up.

I try to make it one of the best emails you get each week, and I hope you're enjoying it.

And should you come across anything interesting this week, send it my way! I love finding new things to read via members of this newsletter.

Until next week, keep experimenting.
 Kevin