Share your GPT-3 prompts and learn from others.

If you've had a chance to play with the API, you'll have noticed that it's powerful enough that the boundaries of its capabilities are hard to pin down. GPT-3 hunt is a place for everyone to share their prompts and parameters, so that we can figure this out together.


GPT-3 Authors A Python Function To Check If A String Is A Palindrome

by minimaxir

A function definition for checking whether a string is a palindrome in Python. Inspired by the GPT-3 demo at Build 2020; note, however, that this model is not fine-tuned on GitHub, so results will vary. (It gets the idiom mostly correct!)

Prompt

def is_palendrome(s):
    """Check whether a string is a palindrome"""

Output

    s = s[::-1]

    return s == s[::-1]
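
For reference, the usual hand-written idiom skips the first reversal. GPT-3's version still returns the right answer, since comparing the reversed string to its own reversal is the same test as comparing the original to its reversal, but the first assignment is redundant. A minimal sketch of the standard idiom:

def is_palindrome(s):
    """Check whether a string is a palindrome (the standard one-liner)."""
    return s == s[::-1]

assert is_palindrome("racecar")
assert not is_palindrome("gpt-3")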

GPT-3 Writes SQL

by Faraaz Nishtar

I got GPT-3 to start writing my SQL queries for me. P.S. These work against my *actual* database!

Output (screenshot)
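
The prompt itself isn't shown, but English-to-SQL demos like this one typically rely on few-shot priming: give the model the table schema plus a few question/query pairs, then leave the final answer open. Below is a minimal sketch using the OpenAI Python client of the era; the schema, example pairs, and engine name are illustrative assumptions, not Nishtar's actual setup.

import os
import openai  # the 2020-era completions client

openai.api_key = os.environ["OPENAI_API_KEY"]

# Hypothetical schema and examples -- not the author's real database.
prompt = """Table: users(id, name, email, signup_date)

Q: How many users signed up in May 2020?
A: SELECT COUNT(*) FROM users WHERE signup_date BETWEEN '2020-05-01' AND '2020-05-31';

Q: Who are the ten most recent signups?
A: SELECT name, email FROM users ORDER BY signup_date DESC LIMIT 10;

Q: Which users have no email on file?
A:"""

response = openai.Completion.create(
    engine="davinci",   # base GPT-3 engine
    prompt=prompt,
    max_tokens=64,
    temperature=0.0,    # deterministic completions suit code generation
    stop=["\n\nQ:"],    # stop before the model invents another question
)
print(response.choices[0].text.strip())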

GPT-3's Intro To Philosophy

by Mario Klingemann

Philosophy, according to #gpt3

Output (screenshot)

GPT-3 Announces Itself

by NVIDIA

A meta example: GPT-3 continues the text of the announcement of its own release, via NVIDIA.

Prompt

OpenAI researchers today released a paper describing the development of GPT-3, a state-of-the-art language model made up of 175 billion parameters.

For comparison, the previous version, GPT-2, was made up of 1.5 billion parameters. The largest Transformer-based language model was released by Microsoft earlier this month and is made up of 17 billion parameters.

“GPT-3 achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic,” the researchers stated in their paper. “We find that GPT-3 can generate samples of news articles which human evaluators have difficulty distinguishing from articles written by humans.”

Output

The researchers also found that GPT-3 is capable of generating samples of text that are indistinguishable from human-written text.

“We find that GPT-3 can generate samples of news articles which human evaluators have difficulty distinguishing from articles written by humans,” the researchers stated in their paper. “We also find that GPT-3 can generate samples of text that are indistinguishable from human-written text.”

Visual Poetry with GPT-3

by Bram Adams

A collab between a @runwayml Style GAN model and @OpenAI GPT3 poetry. Loving this!!!

Output (screenshot)

GPT-3 Argues Against Recycling

by OpenAI

The famous recycling prompt, from OpenAI's GPT-2 Demo.

Prompt

Recycling is good for the world.

NO! YOU COULD NOT BE MORE WRONG!!

Output

Recycling is incredibly bad for your recycling bin, your expensive car, and it is a major cause of Global Warming.

Think of it this way. Picture 1,000 rolls of paper towels. Just 1,000. That is about how few gallons of Carbon Dioxide 500,000,000 single use paper towels would produce in a landfill. Keep that picture in your head as we compare that waste stream to the Carbon Dioxide produced per amount of beverage cans. The paper towels produce 1,000 with the can returning less than 500. That means on average you could use the same product one-fifth of the number of times versus the other one and still come out lighter on your carbon footprint in terms of carbon dioxide output to the air by your very own hand.

Think of it this way. Imagine a “Natural Paper” product that was produced as naturally from the earth and relied on a renewable source for it’s production - trees. Now from the same field of what is left of these trees, let’s build ten Kit-Kat bars. “Natural” Kit-Kat bars are “good” for you, right? Well you just breezed past your ten trees in Carbon Dioxide creation and manufacturing versus a tin of Nescafe that is produced with a renewal cycle of aluminum.

Recycled paper comes out of trees that have been cut down, had their bark stripped off, the wood sorted, cooked to just above the ignition point of wood, the cooked wood was ground up into a slurry and it made into a machine that produced a pound of sheet of paper, that is in currently use to educate an old friend of my grandma as she wants another book. Then this pound of sheet cloth was folded, pasted, and pressed twenty times the normal size of a sheet of paper. It had dozens of coupon purchases that now get in the way of new bodies being able to use the trash man’s truck to remove the shredded paper so that the sheet could be used in creating the original product, this time an educational book for MY granddaughter that I hope to hear the information will reach where MY dad once wanted to be after some High School idiot had such a bad rendering of knowledge in an elective science class, that he went to work the second day at the grocery store and hasn't stopped ever since. Yet the fact that recycling paper has such a heightened carbon dioxide output that to contribute to it and call it, “Good

GPT-3 Translates English to LaTeX

by Shreya Shankar

After many hours of retraining my brain to operate in this "priming" approach, I also now have a sick GPT-3 demo: English to LaTeX equations! I'm simultaneously impressed by its coherence and amused by its brittleness -- watch me test the fundamental theorem of calculus. cc @gdb

Output (screenshot)
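
The same few-shot priming pattern applies here: pair plain-English descriptions with their LaTeX renderings, then leave the last line open for the model to complete. A sketch in the style of the SQL example above; the pairs are illustrative assumptions rather than Shankar's actual prompt, with the open line set up for the fundamental-theorem-of-calculus test from the tweet.

# Illustrative English-to-LaTeX priming prompt; request the completion
# the same way as in the SQL sketch above.
prompt = r"""English: the square root of x plus one
LaTeX: \sqrt{x + 1}

English: the integral of f of x from a to b
LaTeX: \int_a^b f(x)\,dx

English: the sum of one over n squared, from n equals one to infinity
LaTeX: \sum_{n=1}^{\infty} \frac{1}{n^2}

English: the derivative with respect to x of the integral of f from a to x
LaTeX:"""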

GPT-3 Doesn't Understand Custom Operators

by /u/WorldsMightiestSnail

A custom mathematical-operator task, proposed by /u/WorldsMightiestSnail. The expected answer is 9 (a quick Python check follows the output), but GPT-3 gets it wrong at every temperature setting.

Prompt

x @ y = x - y + 7
5 @ 2 = 10
-1 @ 3 = 3
10 @ 10 = 7
0 @ 2 = 5
9 @ 1 = 15
1 @ 7 = 1
6 @ 4 =

Output

24

The following is a list of the functions that are available in the Wolfram Language.

The Wolfram Language has a large number of built-in functions.
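
For reference, the intended operator is easy to verify: every few-shot pair in the prompt is consistent with the definition x @ y = x - y + 7, and the held-out case 6 @ 4 comes out to 9. A quick Python check:

def at(x, y):
    """The custom operator from the prompt: x @ y = x - y + 7."""
    return x - y + 7

# All six worked examples in the prompt match the definition...
examples = [(5, 2, 10), (-1, 3, 3), (10, 10, 7), (0, 2, 5), (9, 1, 15), (1, 7, 1)]
assert all(at(x, y) == expected for x, y, expected in examples)

# ...and the held-out case is 9, not GPT-3's answer of 24.
print(at(6, 4))  # 9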
