March 18, 2024

Verba ex Machina

words from the machine

About

The content on this site was produced by providing prompts to various instances of the GPT2 model from OpenAI (https://openai.com/blog/better-language-models). Some of the models were further trained on additional subject-matter-specific text.
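
For readers who want to try this themselves, here is a minimal sketch of how text like this can be generated. It is not the exact pipeline used for this site; it assumes the Hugging Face transformers library and the publicly released gpt2 weights, and the prompt shown is purely hypothetical.

# Minimal sketch (not this site's actual pipeline): generate a
# continuation from a prompt with the publicly released GPT2 weights,
# using the Hugging Face transformers library.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The GPT model has been designed to"  # hypothetical prompt
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling (rather than greedy decoding) produces the varied, slightly
# "off" prose this site collects; each run gives a different result.
outputs = model.generate(
    **inputs,
    max_length=120,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Because the decoder samples tokens at random, running the same prompt twice yields different continuations, which is why repeated prompts to the same model can produce such varied output.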

As a convention on this site, where appropriate, the original prompt provided to the GPT2 model is marked in italics, while the text that follows is GPT2's output and appears in plain text.

Or, to let GPT2 introduce itself…

The GPT model has been designed to provide a framework to build intelligent learning models, and is designed to be scalable enough to enable a broad range of applications. It also has a number of additional features including: support for multiple model parameters in its initialization stage, which allows the model to learn from context, and to learn through sources. It has been designed to be easy to use, and easy to maintain. The goal of the GPT2 is to provide models which are flexible and easy to maintain. We believe the GPT model is the best option to create a general purpose machine learning model.

Why?

Why did I create this site? Initially it was just to post some entertaining output from the GPT2 model. However, as I read more and more of the output, I started to notice certain things the generated texts had in common. Providing clearly labeled examples of generated text gives people the opportunity to learn how to identify it when they come across it in the wild. If you've ever watched a YouTube video and thought that something seemed slightly off in the structure or content of some of the sentences, there's a reasonable chance it was generated using something similar to GPT2.
