
This Marketing Blog Does Not Exist - kristintynski
http://thismarketingblogdoesnotexist.com/
======
plibither8
A few more...

* Person: [https://thispersondoesnotexist.com/](https://thispersondoesnotexist.com/)

* Waifu: [https://www.thiswaifudoesnotexist.net/](https://www.thiswaifudoesnotexist.net/)

* SO Question: [https://stackroboflow.com/](https://stackroboflow.com/)

* Startup: [https://thisstartupdoesnotexist.com/](https://thisstartupdoesnotexist.com/)

* Resume: [https://thisresumedoesnotexist.com/](https://thisresumedoesnotexist.com/)

~~~
ainar-g
The Waifu one is so realistic that it's uncanny. It's almost like we're just a
few years away from whole seasons of deep-dreamt anime and whole tankoubons of
procedurally generated manga.

~~~
yorwba
Blurb generated by GPT-2 for TWDNE #283:
[https://www.gwern.net/images/gan/thiswaifudoesnotexist-283.p...](https://www.gwern.net/images/gan/thiswaifudoesnotexist-283.png)

“It is so sad to say that this manga has never been seen by any anime fans in
the real world and this is an issue that must be addressed. Please make anime
movies about me. Please make anime about me. Please make anime about your
beautiful cat. Please make anime movies about me. Please make anime about your
cute cat. I wish you the best of luck in your life. Please make anime about
me. Please make anime about my cute cute kitten.”

Maybe it's just a prank, but Gwern's description makes it sound like an
accidental creation of the text generator.
[https://www.gwern.net/TWDNE](https://www.gwern.net/TWDNE)

~~~
gwern
Nope, it's real: that snippet is just that hilariously on point. To be fair,
that's one of the best snippets out of many thousands. Few are anywhere that
amusing.

The images & snippets are 100% uncurated & unedited by a human, other than as
described on my page (primarily: I feed in a long prompt to give the model
keywords to work with, and I postprocess the samples to truncate at
'<|endoftext|>' tokens to keep them on a single topic rather than, as the
default generation process does, forcing the start of a new topic to fill out
the character count).
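In code, that postprocessing step is just a truncation at the first delimiter; a minimal sketch (assuming the raw sample arrives as a plain string containing GPT-2's '<|endoftext|>' token) might look like:

```python
END_TOKEN = "<|endoftext|>"

def first_topic(sample: str) -> str:
    """Keep only the text before the first end-of-text token,
    so each displayed snippet stays on a single topic."""
    return sample.split(END_TOKEN, 1)[0].strip()
```

The default sampling loop instead keeps generating past the delimiter until the requested length is reached, which is what produces the abrupt topic changes.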

------
komali2
Holy fuck that was mostly convincing. I had no idea AI generated text was
getting to this level - I thought we were still at "Her mouth is full of
Secret Soup" level:
[https://twitter.com/keatonpatti/status/1006961202998726665](https://twitter.com/keatonpatti/status/1006961202998726665)

~~~
ajna91
If you want to be even more blown away (warning: some comments may be NSFW),
here's an entire subreddit where every poster and commenter is a GPT-2 bot.
[https://www.reddit.com/r/SubSimulatorGPT2/](https://www.reddit.com/r/SubSimulatorGPT2/)

~~~
c0nducktr
You weren't exaggerating. Some of these are amazing.

> "Don't tell me where I've been": US man who claims to have "never left the
> country"

Not only is it a convincing sentence, it's humorous too.

~~~
lowdose
What a gold nugget!

------
QuackingJimbo
I was ready to crap on this post for calling your own work "semi-convincing"
but this writing does actually resemble most of the random blogs I find when
trying to research something like nutrition or fitness.

Not a compliment to the quality of the writing. But definitely not
unconvincing. Add some ads that make it impossible to scroll without freezing
Chrome and you're good to go.

------
lettergram
First, I feel this is a hand-curated list of results (the same goes for most
of the "this does not exist" stuff). That being said, it is very good. Some
pretty funny quotes:

> You might not think that a mumford brush would be a good filter for an Insta
> story.

They are definitely leaking some of their training data. Many of the names in
the article are real people (which is concerning).

Working on synthetic data generation myself[1], this is not at all surprising.
It's also why we are basically living in a "post-truth" world...

[https://austingwalters.com/the-last-free-generation/](https://austingwalters.com/the-last-free-generation/)

What do you do when anything can be synthetically created?

[1] [https://medium.com/capital-one-tech/why-you-dont-necessarily...](https://medium.com/capital-one-tech/why-you-dont-necessarily-need-data-for-data-science-48d7bf503074)

~~~
kristintynski
It is curated, but not heavily. I usually took the first result of the prompt.
Occasionally I'd skip a result that was totally off, but all of them were the
best out of the first 2-3 for any given prompt.

------
reificator
> _Her Instagram Stories include one that depicts her while wearing a yellow
> dress and champagne flute (I don’t like to see bridesmaids getting
> wrecked)._

~~~
jcims
> _But an eight-month-old story with a tiny paragraph about a building draped
> in chainmail or a photograph of a koi pond with a barcode on it?_

------
JamesBarney
I don't know if this says more about how far AI has come or about the average
quality of a marketing blog.

------
cabaalis
Knowing we have a future of AI-generated content coming, I think about the
lights in the movie "Skyline" that make humans docile so their brains can be
stolen.

------
stingraycharles
This is extremely scary; it would have totally convinced me it was written by
a real person.

Any publications on what technologies are used for this?

~~~
baq
[https://grover.allenai.org/](https://grover.allenai.org/)

------
bbmario
Content generation like this is fascinating. I had a lot of fun with markov
chains in the past, but this is just groundbreaking. Any tips for a fellow
developer that wants to start with this? How do I get started with GPT2 or
Grover?
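~~~
(For contrast with GPT-2's learned representations, the Markov-chain approach mentioned above can be sketched in a few lines of Python; the corpus and seed word here are placeholders:)

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, seed_word, length=10, rng=random):
    """Walk the chain: repeatedly pick a random observed successor."""
    out = [seed_word]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:  # dead end: word never seen mid-corpus
            break
        out.append(rng.choice(successors))
    return " ".join(out)
```

Unlike GPT-2, which attends over hundreds of tokens of context, this conditions on exactly one previous word, which is why its output drifts into nonsense so quickly.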

------
topynate
I see that the posts aren't generated in real time. Were they curated? Even if
these are just the top 5%, they're still extremely impressive.

------
kashishhora
I made a site to aggregate the other "This ____ does not exist" sites that
have popped up: thisxdoesnotexist.com

------
naeemtee
Anyone know of any open source libraries that are even remotely close in
quality/effectiveness as Grover AI?

~~~
gwern
Grover is open source. They open-sourced it the other day.

~~~
bbmario
Does it come with any samples on how to use it? Like, how to train, how to
generate after training, etc.

~~~
gwern
Eh, sort of:
[https://github.com/rowanz/grover](https://github.com/rowanz/grover) It can't
be that difficult if OP did it so quickly, after all.

If you want much more detailed documentation, I wrote up in detail how to
train & generate text with the original GPT-2 models using nshepperd's
codebase: [https://www.gwern.net/GPT-2](https://www.gwern.net/GPT-2)

minimaxir also has an actively maintained codebase which I believe has powered
some of the GPT-2 projects you might've seen recently, like Talk to
Transformer:
[https://github.com/minimaxir/gpt-2-simple](https://github.com/minimaxir/gpt-2-simple)

~~~
bbmario
Thank you so much!

------
numtel
Disappointed that I didn't see a new generation upon refresh...

------
sdinsn
Ironically, this marketing blog _does_ exist:

> About This Blog

> This was created by the Content Marketing agency Frac.tl as a demonstration

This is just an ad.

------
scottndecker
Someone didn't provision enough throughput...I'm getting a 508 status

~~~
kristintynski
working on it... sorry!

------
gregoryexe
I imagine HN is an AI promoting other AIs, which in turn...

------
xwdv
I suggest not reading the generated article. Trying to comprehend what you are
reading does not feel good in the brain, and doing it long term may severely
impact your reading comprehension skills and can make reading into an
exhausting activity.

It’s not the same as trying to understand sentences with bad grammar; at least
those texts represent a human trying to express an idea, so you know that with
enough effort you can eventually come to understand what they are trying to
say, and maybe score a dopamine hit.

In this case, no amount of effort will bring meaning to AI-generated
gibberish.

