
Priceonomics Idea Tester - bemmu
https://priceonomics.com/priceonomics-idea-tester/
======
Kluny
So you make a clickbait title with a snippet of text and see if people click?
And then if they click, they get... what? Clearly not the article, since you
haven't written it yet. This sounds like a good way to irritate people. So
then you write the article that got the most clicks and present it to the same
target audience, who scratches their head and thinks, "Hmm... didn't I click
on a link about that story last month? And wasn't it a scam of some kind?"

Trust is hard to earn and instantly lost. Does priceonomics have some idea of
how to prevent that?

~~~
lliiffee
They prevent that by hosting it somewhere no one can tell is associated with
you. Not saying that's ethical, but it does prevent immediate harm to your
brand, unless people somehow remember clicking the link when they later see
the actual article. (And that would still be a tiny fraction.)

~~~
Kluny
I don't think that's enough, to be honest. I mean, people in various
demographics all kind of get their news from the same places - reddit (various
subreddits), the verge, vox, breitbart, fox, whatever your flavor is, your
target audience can be found there on the regular. I'd be hesitant to test an
article idea on, for example, cracked.com, then publish the real thing on
/r/pcmasterrace. If I'm writing for that audience, I'd expect them to be in
both places and I'd expect them to remember.

~~~
savanaly
I dunno man, the internet is a pretty big place. If you're hitting a level of
popularity where a significant portion of the people who saw your first fake
article then see it when it's published in an unrelated place the next month,
this whole discussion might be moot.

------
birken
This is a good example of what _not_ to do as a data scientist: draw confident
conclusions when there is not strong evidence in the data to support them.

First of all, an R-squared of 0.15 is nothing to write home about. That is
telling you that a massive amount of the variance in the data is not explained
by whatever field(s) you are looking at. But hey, 0.15 is something! That
means it has "some predictive power"! Maybe, maybe not.

Second, saying you've found something with "some predictive power" is a
cop-out. Your human editors and writers have a lot of "predictive power"
themselves just based on their experience. The real question is whether the
predictive power of this CTR test is a useful complement to their judgment,
and/or whether it is worth all the time/effort you put into it.

So let's look at the data a little more deeply. A good trick when analyzing
data is to throw out the outliers and then see if that changes your
conclusions. This needs to be done with care (sometimes outliers are
important!), but when analyzing data you want to play with it a little bit to
get a feel for it. In this dataset in particular, what if I throw out the 2
highest-pageview stories and the 2 lowest-pageview stories? Well, the whole
conclusion collapses, because the 2 highest-pageview stories are basically the
only 2 data points supporting this conclusion in the entire dataset. Well
crap, that isn't good. At this point I'd stop being confident at all, because
if taking away a couple of outliers destroys your conclusion, that is a bad
sign. That doesn't mean the CTR tool is worthless, it just means you can't
make the confident claim about its usefulness from the data.
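That outlier check can be sketched in a few lines. The numbers below are
made up to mimic the shape of the dataset (the post's real numbers aren't
reproduced here): a cloud of ~28 ordinary stories plus two high-CTR,
high-pageview hits.

```python
import numpy as np

def r_squared(x, y):
    """R^2 of a simple least-squares line fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

# Synthetic stand-in for the ~30 (CTR, pageview) pairs: mostly noise,
# plus two high-pageview stories that also happened to have high CTR.
rng = np.random.default_rng(0)
ctr = np.append(rng.uniform(0.5, 3.0, 28), [4.0, 4.5])
views = np.append(rng.normal(5000, 1500, 28), [40000.0, 55000.0])

full = r_squared(ctr, views)

# Drop the 2 highest- and 2 lowest-pageview stories, then refit.
keep = np.argsort(views)[2:-2]
trimmed = r_squared(ctr[keep], views[keep])

print(f"R^2 with all stories: {full:.2f}, without outliers: {trimmed:.2f}")
```

With data shaped like this, the trimmed R^2 collapses toward zero: the fit
was never about the bulk of the stories, just the two hits.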

But let's keep going anyway; the idea seems interesting. Here is how to test
it. Give access to this tool to half of your writers and not the other half.
Let them each write a bunch of stories and compare the pageviews of the two
sets. Since you can't make this test completely randomized, you are going to
have to control for other factors which might be important (a writer's skill
level, past experience, etc.), but it should be doable. Get a bunch of data
points, plug them into a model, and see if the tool had any impact. At that
point you should be ready to write a blog post about it.
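The "plug them into a model" step could be a plain regression with a dummy
for tool access plus the controls. Everything here is invented for
illustration: the sample size, the effect sizes, and `experience` standing in
as the skill control.

```python
import numpy as np

# Hypothetical data: each row is one story. We record whether the writer
# had access to the tool (0/1), the writer's experience in years (a crude
# control for skill), and the story's pageviews.
rng = np.random.default_rng(1)
n = 200
has_tool = rng.integers(0, 2, n).astype(float)
experience = rng.uniform(0, 10, n)
# Simulated outcome: experience matters a lot, the tool adds a small bump.
views = 2000 + 400 * experience + 300 * has_tool + rng.normal(0, 800, n)

# OLS fit: views ~ intercept + has_tool + experience
X = np.column_stack([np.ones(n), has_tool, experience])
coef, *_ = np.linalg.lstsq(X, views, rcond=None)
intercept, tool_effect, exp_effect = coef
print(f"estimated tool effect: {tool_effect:.0f} extra pageviews per story")
```

The coefficient on `has_tool` is the tool's estimated effect holding the
control fixed; with real data you'd also want its standard error before
claiming anything.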

------
amelius
TL;DR:

> Idea Tester works by purchasing a $10 ad for each concept and measuring the
> click-through rate on the ads. $10 isn’t always enough to give you
> statistically significant result, but frequently it gives you an insight
> that helps you avoid writing articles no one cares about.

------
rubidium
"articles that had high click-through rates tended to do well"

Wait, that's not what their data showed at all. There was a TINY "correlation"
between CTR and total pageviews. And with a sample size of 30 across this
distribution, it's hard to say anything with statistical significance.
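A quick back-of-envelope check (assuming the post's 0.15 is an R-squared over
roughly 30 stories, and using the standard t-test for a Pearson correlation):
the implied correlation only barely clears the conventional two-tailed 5%
cutoff, so the verdict flips as soon as a couple of points move, which is
exactly the fragility the outlier discussion elsewhere in this thread points
at.

```python
import math

# Does an R-squared of 0.15 clear the conventional 5% cutoff at n = 30?
n = 30
r = math.sqrt(0.15)          # Pearson r implied by R^2 = 0.15

# t statistic for testing r != 0, with n - 2 degrees of freedom.
t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

t_crit = 2.048               # two-tailed 5% critical value for df = 28
print(f"r = {r:.2f}, t = {t:.2f}, critical value = {t_crit}")
```

t comes out around 2.2 against a cutoff of about 2.05: nominally significant,
but with essentially no margin for a dataset this small.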

------
mdorazio
This was not what I was expecting. They're just using a small ad spend to
measure CTR on the proposed article headlines/stubs instead of using some kind
of ML to predict success like I originally thought. The approach is also not
very cheap. At $40 in ad spend and at least a day to get the results, you
could have just paid someone to write one or more blogspam articles, which is
what this will optimize for.

~~~
brianwawok
And where does the click go? A page "Thanks for voting on this title!"

If you do that for a few days, people are going to stop clicking on your
clickbait titles with no content.

I totally get rotating headlines to see which gets more clicks. I do not get
rotating titles before the article is written. That is going to make users
quit clicking on your ads very quickly ;)

~~~
whatshisface
How will they know which ads are yours? The CMS is off-brand and Priceonomics
says they will host it.

If this tragedy of the commons really kicks in, you can bet advertisers will
start calling for FCC intervention...

------
qwrusz
_I'd love to know if Priceonomics tested the idea for this "Idea Tester"
product of theirs before launching it._

This type of "customer research" trickery is not new. And I realize
clickbait/fakebait is big busine$$ and many companies just don't have qualms
about it or care how much online readers despise it.

I'm surprised Priceonomics decided this was a good idea to pursue and launch
like this:

#1 This type of idea testing can't reliably offer insights meaningful enough
to be useful, and sometimes it is so incorrect it is harmful.

#2 Many of Priceonomics' potential customers have the integrity not to use
these methods. Worse, there are potential customers (especially bigger
corporate customers with headline risk) who won't look at any of the firm's
other services, or want to be associated with it at all, if this is part of
what it does.

#3 The timing to publicly launch a product associated with clickbait or fake
news _could not have been worse!_ Clickbait is getting extra negative
attention right now, and FB and Google are finally changing their policies
toward it.

There are methods for legitimate idea testing, and huge demand for those
services. Why not offer that?

------
sporkologist
> You could invest hours, days, or years in a writing project, only to find
> out that you’ve picked a bad idea that doesn’t interest people.

To paraphrase Elon Musk: Keep asking for criticism of your idea, especially
negative criticism. If your idea has merit, people will like it, and that
will result in a better chance of success. If you can't handle negative
criticism, then you need to work on your own skills at handling rejection.

Tricking people via this Priceonomics "pre-article-writing" trick just seems
like a trick.

~~~
marcus_holmes
People may say they like your idea, but will they buy it? What people say
they will do and what they actually do are two different things.

Testing to see whether people will actually click on an article is way more
informative than asking them if they'll click on it. It's not a trick, it's
testing customer behaviour rather than customer opinion.

~~~
kardos
> It's not a trick

But it is a trick, it's a variant of bait-and-switch where the bait didn't
even exist to begin with and they don't bother switching something in.
(Presumably-- the article doesn't say what the user sees after clicking the
fake link). If I clicked it, I would feel misled. In a sense it's harmless,
there's no shortage of misleading crap on the internet and this is just
another drop in the ocean. But it's definitely in bad faith. Payback for using
ad blockers?

~~~
marcus_holmes
yeah I guess in that sense it's a trick. But serves the buggers right for
lying in the first place!

If people would just say what they mean and mean what they say, we would have
nicer things ;)

------
squozzer
The idea itself (essentially pre-selling content) is not novel. Old print
magazines used to have ads for products that didn't exist, but would if the ad
generated enough interest.

------
devgutt
Interesting to notice that they are writing not what they believe, but what
they think the readers will like. Very inspiring.

------
JoeAltmaier
Writers give a title and a chapter to an Editor to 'shop' a novel. This has
been going on for centuries.

------
bradknowles
Finally! Just what we've always needed!

An automated system to help teach stupid human beings how to write good click
bait headlines!

Okay folks, everyone else can go home now. The Internet has finally been
conclusively won!

------
AznHisoka
And what about the scrapers that scrape every page they can find online? Now
when you publish it, Google sees it as duplicate content.

~~~
mck-
Can be prevented with <meta name="robots" content="noindex">

------
joshdance
I think this is a good idea, but curious to see how it works. Do people see
who the ad is coming from? How many people usually see the ad?

------
tszming
Content farms will find it useful.

