Get Help #194
chenxizhang started this conversation in Use cases
Replies: 2 comments
-
Get-Help New-ImageGeneration -Full
NAME
New-ImageGeneration
SYNOPSIS
Generate an image from a prompt, using the DALL-E 3 model.
SYNTAX
New-ImageGeneration [[-prompt] <String>] [-api_key <String>] [-endpoint <String>] -azure
[-size <String>] [-outfolder <String>] [-environment <String>] [<CommonParameters>]
DESCRIPTION
Generate an image from a prompt, using the DALL-E 3 model. The image size can be 1024x1024,
1792x1024, or 1024x1792.
PARAMETERS
-prompt <String>
The prompt used to generate the image; this is required, and it can be passed from the
pipeline. If you want to use a file as the prompt, you can specify the file path here. You
can also specify a URL as the prompt, and its content will be read as the prompt. You can
read the prompt from a library (https://github.com/code365opensource/promptlibrary) by
using "lib:xxxxx" as the prompt, for example, "lib:fitness".
Required? false
Position? 1
Default value None
Accept pipeline input? False
Accept wildcard characters? false
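For instance, based on the description above, the prompt can come from a literal string, a local file, or the prompt library. A minimal sketch (the file path below is a hypothetical placeholder):
# literal prompt
New-ImageGeneration -prompt "A watercolor of a lighthouse at dawn"
# prompt read from a local text file (hypothetical path)
New-ImageGeneration -prompt "c:\prompts\lighthouse.txt"
# prompt loaded from the prompt library entry named "fitness"
New-ImageGeneration -prompt "lib:fitness"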
-api_key <String>
The API key to access the OpenAI service. If not specified, the API key will be read
from the environment variable OPENAI_API_KEY. If you use the Azure OpenAI service, you
can specify the API key via the environment variable OPENAI_API_KEY_AZURE or
OPENAI_API_KEY_AZURE_<environment>; the <environment> can be any name you want, for
example, OPENAI_API_KEY_AZURE_DEV, OPENAI_API_KEY_AZURE_PROD,
OPENAI_API_KEY_AZURE_TEST, etc.
Required? false
Position? named
Default value None
Accept pipeline input? False
Accept wildcard characters? false
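As a minimal sketch of the environment-variable convention described above (all key values below are placeholders):
# key for the standard OpenAI service
$env:OPENAI_API_KEY = "<your OpenAI key>"
# key for the Azure OpenAI service, plus one scoped to a custom "DEV" environment
$env:OPENAI_API_KEY_AZURE = "<your Azure key>"
$env:OPENAI_API_KEY_AZURE_DEV = "<your Azure key for the dev environment>"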
-endpoint <String>
The endpoint to access the OpenAI service. If not specified, the endpoint will be read
from the environment variable OPENAI_ENDPOINT. If you use the Azure OpenAI service, you
can specify the endpoint via the environment variable OPENAI_ENDPOINT_AZURE or
OPENAI_ENDPOINT_AZURE_<environment>; the <environment> can be any name you want,
for example, OPENAI_ENDPOINT_AZURE_DEV, OPENAI_ENDPOINT_AZURE_PROD,
OPENAI_ENDPOINT_AZURE_TEST, etc.
Required? false
Position? named
Default value None
Accept pipeline input? False
Accept wildcard characters? false
-azure [<SwitchParameter>]
Use the Azure OpenAI service. If specified, the API key and endpoint will be read from
the environment variables OPENAI_API_KEY_AZURE or OPENAI_API_KEY_AZURE_<environment>,
and OPENAI_ENDPOINT_AZURE or OPENAI_ENDPOINT_AZURE_<environment>; the <environment> can
be any name you want, for example, OPENAI_API_KEY_AZURE_DEV, OPENAI_API_KEY_AZURE_PROD,
OPENAI_API_KEY_AZURE_TEST, etc.
Required? true
Position? named
Default value False
Accept pipeline input? False
Accept wildcard characters? false
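Putting the -azure switch together with the variables described above, a hedged end-to-end sketch (the endpoint and key values are placeholders):
# configure Azure credentials once per session
$env:OPENAI_API_KEY_AZURE = "<your Azure key>"
$env:OPENAI_ENDPOINT_AZURE = "https://<your-resource>.openai.azure.com/"
# generate an image through the Azure OpenAI service
New-ImageGeneration -prompt "A painting of a cat sitting on a chair" -azure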
-size <String>
The size of the image to generate. The value can be small (1024x1024),
medium (1792x1024), or large (1024x1792); the default is small.
Required? false
Position? named
Default value Small
Accept pipeline input? False
Accept wildcard characters? false
-outfolder <String>
The folder where the generated image is saved; the default is the current folder. You
can use "out" as an alias for this parameter.
Required? false
Position? named
Default value .
Accept pipeline input? False
Accept wildcard characters? false
-environment <String>
The environment name. If you use the Azure OpenAI service, you can specify the
environment with this parameter; the environment name can be any name you want, for
example, dev, prod, test, etc. It is used to read the API key and endpoint from
environment variables, for example, OPENAI_API_KEY_AZURE_DEV and
OPENAI_ENDPOINT_AZURE_DEV. You can use "env" as an alias for this parameter.
Required? false
Position? named
Default value None
Accept pipeline input? False
Accept wildcard characters? false
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (https://go.microsoft.com/fwlink/?LinkID=113216).
INPUTS
OUTPUTS
System.String, the file path of the generated image.
NOTES
-------------------------- EXAMPLE 1 --------------------------
New-ImageGeneration -prompt "A painting of a cat sitting on a chair"
Use the DALL-E 3 model to generate an image. The image size is 1024x1024, and the
generated image will be saved to the current folder.
-------------------------- EXAMPLE 2 --------------------------
image -prompt "A painting of a cat sitting on a chair"
Use the alias (image) to generate an image. The image size is 1024x1024, and the
generated image will be saved to the current folder.
-------------------------- EXAMPLE 3 --------------------------
"A painting of a cat sitting on a chair" | New-ImageGeneration
Pass the prompt from the pipeline. The image size is 1024x1024, and the generated image
will be saved to the current folder.
-------------------------- EXAMPLE 4 --------------------------
New-ImageGeneration -prompt "A painting of a cat sitting on a chair" -size medium
-outfolder "c:\temp" -api_key "your API key" -endpoint "your endpoint"
Use the DALL-E 3 model to generate an image. The image size is 1792x1024, the generated
image will be saved to the c:\temp folder, and your own API key and endpoint are used.
-------------------------- EXAMPLE 5 --------------------------
New-ImageGeneration -prompt "A painting of a cat sitting on a chair" -size small
-outfolder "c:\temp" -azure
Use the DALL-E 3 model to generate an image. The image size is 1024x1024, the generated
image will be saved to the c:\temp folder, and the Azure OpenAI service is used.
-------------------------- EXAMPLE 6 --------------------------
New-ImageGeneration -prompt "A painting of a cat sitting on a chair" -size small
-outfolder "c:\temp" -azure -environment "dev"
Use the DALL-E 3 model to generate an image. The image size is 1024x1024, the generated
image will be saved to the c:\temp folder, the Azure OpenAI service is used, and the API
key and endpoint are read from the environment variables OPENAI_API_KEY_AZURE_DEV and
OPENAI_ENDPOINT_AZURE_DEV.
-------------------------- EXAMPLE 7 --------------------------
New-ImageGeneration -outfolder "c:\temp" -azure -prompt "c:\temp\prompt.txt"
Use the DALL-E 3 model to generate an image. The image size is 1024x1024, the generated
image will be saved to the c:\temp folder, the Azure OpenAI service is used, and the
prompt is read from the file c:\temp\prompt.txt.
RELATED LINKS
https://github.com/chenxizhang/openai-powershell
-
Get-Help New-ChatCompletions -Full
-
You can use the cmdlets below to get full help.
Get-Help New-ChatGPTConversation -Full
NAME
New-ChatGPTConversation
SYNOPSIS
Create a new ChatGPT conversation, or get a chat completion result if you specify the prompt parameter directly.
SYNTAX
New-ChatGPTConversation [[-api_key] <String>] [[-model] <String>] [[-endpoint] <String>] [[-system] <String>]
[[-config] <PSObject>] [[-outFile] <String>] [-json] [[-context] <PSObject>] [[-headers] <PSObject>]
[[-functions] <String[]>] [[-environment] <String>] [[-env_config] <String>] [<CommonParameters>]
DESCRIPTION
Create a new ChatGPT conversation; you can chat with the OpenAI service just like chatting with a human. You can
also get the chat completion result directly if you specify the prompt parameter.
PARAMETERS
-api_key <String>
The API key to access the OpenAI service. If not specified, the API key will be read from the environment
variable OPENAI_API_KEY. You can also use "token", "access_token", or "accesstoken" as the alias.
Required? false
Position? 1
Default value
Accept pipeline input? false
Accept wildcard characters? false
-model <String>
The model to use for this request. You can also set it in the environment variable OPENAI_API_MODEL. If you are
using the Azure OpenAI Service, the model should be the deployment name you created in the portal.
Required? false
Position? 2
Default value
Accept pipeline input? false
Accept wildcard characters? false
-endpoint <String>
The endpoint to use for this request. You can also set it in the environment variable OPENAI_API_ENDPOINT. You
can also use some special values to specify the endpoint, like "ollama", "local", "kimi", "zhipu".
Required? false
Position? 3
Default value
Accept pipeline input? false
Accept wildcard characters? false
-system <String>
The system prompt. This is a string you can use to define the role you want the assistant to play, for example,
"You are a chatbot, please answer the user's question according to the user's language." If you provide a file
path to this parameter, the file will be read as the system prompt. You can also specify a URL, and its content
will be read as the system prompt. You can read the prompt from a library
(https://github.com/code365opensource/promptlibrary) by using "lib:xxxxx" as the prompt, for example,
"lib:fitness".
Required? false
Position? 4
Default value You are a chatbot, please answer the user's question according to the user's language.
Accept pipeline input? false
Accept wildcard characters? false
-config <PSObject>
The dynamic settings for the API call; it can meet all the requirements of each model. Please pass a custom
object to this parameter, like @{temperature=1;max_tokens=1024}.
Required? false
Position? 5
Default value
Accept pipeline input? false
Accept wildcard characters? false
-outFile <String>
If you want to save the result to a file, you can use this parameter to set the file path. You can also use
"out" as the alias.
Required? false
Position? 6
Default value
Accept pipeline input? false
Accept wildcard characters? false
-json [<SwitchParameter>]
Send the response in JSON format.
Required? false
Position? named
Default value False
Accept pipeline input? false
Accept wildcard characters? false
-context <PSObject>
If you want to pass some dynamic values to the prompt, you can use the context parameter. It can be anything;
just specify a custom PowerShell object here. You define the variables in the system prompt or user prompt by
using the {{your_variable_name}} syntax, and then pass the data to the context parameter, like
@{your_variable_name="your value"}. If there are multiple variables, you can use
@{variable1="value1";variable2="value2"}.
Required? false
Position? 7
Default value
Accept pipeline input? false
Accept wildcard characters? false
-headers <PSObject>
If you want to pass some custom headers to the API call, you can use this parameter. You can pass a custom
hashtable to this parameter, like @{header1="value1";header2="value2"}.
Required? false
Position? 8
Default value
Accept pipeline input? false
Accept wildcard characters? false
-functions <String[]>
This is a super powerful feature to support the function_call capability of OpenAI. You can specify the function
name(s), and the function will be automatically called when the assistant needs it. You can find all the
available function definitions here
(https://raw.githubusercontent.com/chenxizhang/openai-powershell/master/code365scripts.openai/Private/functions.json).
Required? false
Position? 9
Default value
Accept pipeline input? false
Accept wildcard characters? false
-environment <String>
If you have multiple environments to use, you can specify the environment name here and then define the
environment in the profile.json file. You can also use "profile" or "env" as the alias.
Required? false
Position? 10
Default value
Accept pipeline input? false
Accept wildcard characters? false
-env_config <String>
The profile.json file path; the default value is "$env:USERPROFILE/.openai-powershell/profile.json".
Required? false
Position? 11
Default value "$env:USERPROFILE/.openai-powershell/profile.json"
Accept pipeline input? false
Accept wildcard characters? false
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug, ErrorAction, ErrorVariable, WarningAction,
WarningVariable, OutBuffer, PipelineVariable, and OutVariable. For more information, see about_CommonParameters
(https://go.microsoft.com/fwlink/?LinkID=113216).
INPUTS
OUTPUTS
System.String, the completion result.
-------------------------- EXAMPLE 1 --------------------------
PS C:\>New-ChatGPTConversation
Use the OpenAI service with all the default settings; the API key will be read from the environment variable
(OPENAI_API_KEY), and chat mode is entered.
-------------------------- EXAMPLE 2 --------------------------
PS C:\>New-ChatGPTConversation -api_key "your api key" -model "gpt-3.5-turbo"
Use the OpenAI service with the specified API key and model, and enter chat mode.
-------------------------- EXAMPLE 3 --------------------------
PS C:\>chat -system "You help me to translate the text to Chinese."
Use the OpenAI service to translate text (system prompt specified); the API key will be read from the
environment variable (OPENAI_API_KEY), and chat mode is entered.
-------------------------- EXAMPLE 4 --------------------------
PS C:\>chat -endpoint "ollama" -model "llama3"
Use the OpenAI service with a local model, and enter chat mode.
-------------------------- EXAMPLE 5 --------------------------
PS C:\>chat -endpoint $env:OPENAI_API_ENDPOINT_AZURE -model $env:OPENAI_API_MODEL_AZURE -api_key $env:OPENAI_API_KEY_AZURE
Use the Azure OpenAI service with the specified API key and model, and enter chat mode.
-------------------------- EXAMPLE 6 --------------------------
PS C:\>gpt -system "Translate the text to Chinese." -prompt "Hello, how are you?"
Use the OpenAI service to translate text (system prompt specified); the API key will be read from the
environment variable (OPENAI_API_KEY), the model from OPENAI_API_MODEL (if present) or "gpt-3.5-turbo" as the
default, and the chat completion result is returned directly.
-------------------------- EXAMPLE 7 --------------------------
PS C:\>"Hello, how are you?" | gpt -system "Translate the text to Chinese."
Use the OpenAI service to translate text (system prompt specified; the user prompt is passed from the pipeline);
the API key will be read from the environment variable (OPENAI_API_KEY), the model from OPENAI_API_MODEL (if
present) or "gpt-3.5-turbo" as the default, and the chat completion result is returned directly.
RELATED LINKS
https://github.com/chenxizhang/openai-powershell
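As a supplement to the -context parameter described above, here is a hedged sketch of the {{variable}} templating; the variable name "language" and the values are illustrative only:
# {{language}} in the system prompt is replaced with the value passed via -context
PS C:\>"Hello, how are you?" | gpt -system "Translate the text to {{language}}." -context @{language="French"}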