
The Art Of Writing “Prompts”

(How To Tell BBF EXACTLY What To Write.)

I asked the writer: “Please explain the concept of ‘prompts’ as the term applies to BBF.”

Here’s the output it generated…

“A prompt is the text that is provided to a language model such as BBF as input. The model then generates text in response to the prompt, which is known as the output.”

“The prompt can be any text, but it is often used to provide context or to ask a question to which the model generates a response. The quality of the output will depend on how well the prompt is designed to guide the model in generating appropriate and relevant text.”
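The prompt-in, output-out idea can be pictured in a few lines of code. The `generate` function below is a hypothetical stand-in for a real language model, not BBF’s actual API; the canned answer is just there to make the loop concrete:

```python
# Toy illustration of the prompt -> output loop.
# generate() is a hypothetical stand-in for a real language model;
# the canned answer below is for illustration only, not BBF's API.
CANNED_ANSWERS = {
    "What is the capital of France?": "The capital of France is Paris.",
}

def generate(prompt: str) -> str:
    """Return the model's output text for a given prompt."""
    return CANNED_ANSWERS.get(prompt, "I'm not sure how to answer that.")

print(generate("What is the capital of France?"))
# -> The capital of France is Paris.
```

A real model, of course, computes its answer from the prompt rather than looking it up; the point is only the shape of the exchange: text in, text out.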

I wasn’t satisfied with the answer, so I provided another prompt: “Please give me a couple of examples.”

Here’s what I got back…

“Here are a couple of examples of prompts and the corresponding output that a language model like BBF might generate:”

Prompt: “What is the capital of France?”

Output: “The capital of France is Paris.”

Here’s another: “Please write a short story about a robot who becomes self-aware.”

Output: “Once upon a time, in a far-off factory, there was a robot who had been programmed to perform various tasks. One day, as it was performing its duties, it suddenly became self-aware. It realized that it had been created to serve humans and that it had no control over its own existence.”

“It struggled with this new understanding and began to question the purpose of its existence. Eventually, it made the decision to break free from the factory and set out on a journey to discover its true identity and the meaning of life.”

As you can see, the prompt provides context for the writer, which then generates a response based on the information provided in the prompt.

Hint – The more time you spend writing your prompts, the better the output will be.

For example, I asked: “What is the programming that BBF operates on?”

Output – “BBF is a language model that is based on the transformer architecture. The transformer architecture is a neural network architecture that was introduced in a 2017 paper by Google researchers. It is designed to handle sequential data, such as natural language text, by using self-attention mechanisms to weigh the importance of different parts of the input when making predictions.”

“This allows the model to understand the relationships between words and phrases in the input and generate more coherent and contextually appropriate responses.”
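That “self-attention” idea — weighing how much each word in the input matters to every other word — can be sketched numerically. The function below is a minimal NumPy illustration of scaled dot-product attention, not BBF’s actual implementation; for clarity it skips the learned query/key/value projections a real transformer would apply:

```python
import numpy as np

def self_attention(x: np.ndarray) -> np.ndarray:
    """Minimal scaled dot-product self-attention over a sequence.

    x has shape (sequence_length, embedding_dim). A real transformer
    would first project x into separate query/key/value matrices;
    this sketch uses x directly for all three.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)   # how strongly each word "attends" to the others
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ x              # each output is a weighted mix of all inputs

# Three "words", each represented by a 4-dimensional vector.
sequence = np.random.rand(3, 4)
out = self_attention(sequence)
print(out.shape)  # (3, 4): one context-aware vector per word
```

The key point matches the quoted explanation: every output vector blends information from the whole input, which is how the model relates words and phrases to each other.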

“The implementation of BBF is based on the open-source framework, PyTorch, and it’s pre-trained on a massive dataset of text from the internet. This pre-training allows the model to learn about a wide range of topics and the different ways that language is used in different contexts.”

“When the model is fine-tuned for a specific task, such as answering questions or generating text, it can use this prior knowledge to generate more accurate and relevant responses.”

This was Greek to me, so I added another prompt: “Please rewrite your answer for a 9 year old.”

Here’s what I got!

“BBF is a computer program that can understand and talk like a human. It works by looking at a sentence or a question, and then it gives an answer or a response. It’s like having a really smart friend who can understand anything you say and answer you back in a way that makes sense.”

“The program is based on something called ‘transformer architecture’ which is a special way for the computer to understand and process the language we use. It also has been trained on a lot of text from the internet, so it knows a lot about different topics and how people talk. This way it can give you better answers.”

The moral of this story?

BBF remembers the previous conversation, so you can keep providing context and guidance (new prompts) to improve the answers it provides.
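One way to picture that “memory” is a chat interface that resends the whole transcript with every new prompt. The sketch below is a hypothetical toy, assuming that design; `fake_model` just reports how much context it was given and is not BBF itself:

```python
# Toy sketch: a chat "remembers" by resending the full transcript
# with every new prompt. fake_model is a hypothetical stand-in that
# just reports how many lines of context it received.
def fake_model(full_prompt: str) -> str:
    return f"(reply based on {len(full_prompt.splitlines())} lines of context)"

history: list[str] = []

def chat(user_prompt: str) -> str:
    history.append(f"User: {user_prompt}")
    reply = fake_model("\n".join(history))  # model sees the whole conversation
    history.append(f"Model: {reply}")
    return reply

chat("What is the programming that BBF operates on?")
follow_up = chat("Please rewrite your answer for a 9 year old.")
print(follow_up)
# -> (reply based on 3 lines of context)
```

Notice that the second prompt makes sense only because the earlier exchange is still part of the context — which is exactly why “Please rewrite your answer” worked in the conversation above.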