Python: Provide full list of samples in concepts README. (#10161)
### Motivation and Context

Provide the full list of samples including hyperlinks in the
samples/concepts/README.md file instead of only the type of sample.


### Description

This PR:
- Provides the full list of samples including hyperlinks in the
samples/concepts/README.md file instead of only the type of sample.
- Renames the "structured_output" folder to "structured_outputs", which is the
name OpenAI uses.
- Closes #6928


### Contribution Checklist


- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone 😄
moonbox3 authored Jan 13, 2025
1 parent fac3205 commit 3e4d8fc
Showing 4 changed files with 152 additions and 27 deletions.
179 changes: 152 additions & 27 deletions python/samples/concepts/README.md
@@ -1,32 +1,157 @@
# Semantic Kernel Concepts by Feature

This section contains code snippets that demonstrate the usage of Semantic Kernel features.

| Features | Description |
| -------- | ----------- |
| Agents | Creating and using [agents](../../semantic_kernel/agents/) in Semantic Kernel |
| Audio | Using services that support audio-to-text and text-to-audio conversion |
| AutoFunctionCalling | Using `Auto Function Calling` to allow function call capable models to invoke Kernel Functions automatically |
| ChatCompletion | Using [`ChatCompletion`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/chat_completion_client_base.py) messaging capable service with models |
| ChatHistory | Using and serializing the [`ChatHistory`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/contents/chat_history.py) |
| Filtering | Creating and using Filters |
| Functions | Invoking [`Method`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/functions/kernel_function_from_method.py) or [`Prompt`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/functions/kernel_function_from_prompt.py) functions with [`Kernel`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/kernel.py) |
| Grounding | An example of how to perform LLM grounding |
| Local Models | Using the [`OpenAI connector`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion.py) and [`OnnxGenAI connector`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/onnx/services/onnx_gen_ai_chat_completion.py) to talk to models hosted locally in Ollama, OnnxGenAI and LM Studio |
| Logging | Showing how to set up logging |
| Memory | Using [`Memory`](https://github.com/microsoft/semantic-kernel/tree/main/dotnet/src/SemanticKernel.Abstractions/Memory) AI concepts |
| Model-as-a-Service | Using models deployed as [`serverless APIs on Azure AI Studio`](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio) to benchmark model performance against open-source datasets |
| On Your Data | Examples of using AzureOpenAI [`On Your Data`](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/use-your-data?tabs=mongo-db) |
| Planners | Showing the uses of [`Planners`](https://github.com/microsoft/semantic-kernel/tree/main/python/semantic_kernel/planners) |
| Plugins | Different ways of creating and using [`Plugins`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/functions/kernel_plugin.py) |
| Processes | Examples of using the [`Process Framework`](../../semantic_kernel/processes/) |
| PromptTemplates | Using [`Templates`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/prompt_template/prompt_template_base.py) with parametrization for `Prompt` rendering |
| RAG | Different ways of `RAG` (Retrieval-Augmented Generation) |
| Search | Using search services information |
| Service Selector | Shows how to create and use a custom service selector class. |
| Setup | How to setup environment variables for Semantic Kernel |
| Structured Output | How to leverage OpenAI's json_schema structured output functionality. |
| TextGeneration | Using [`TextGeneration`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/text_completion_client_base.py) capable service with models |
## Table of Contents

### Agents - Creating and using [agents](../../semantic_kernel/agents/) in Semantic Kernel

- [Assistant Agent Chart Maker](./agents/assistant_agent_chart_maker.py)
- [Assistant Agent File Manipulation](./agents/assistant_agent_file_manipulation.py)
- [Assistant Agent File Manipulation Streaming](./agents/assistant_agent_file_manipulation_streaming.py)
- [Assistant Agent Retrieval](./agents/assistant_agent_retrieval.py)
- [Assistant Agent Streaming](./agents/assistant_agent_streaming.py)
- [Chat Completion Function Termination](./agents/chat_completion_function_termination.py)
- [Mixed Chat Agents](./agents/mixed_chat_agents.py)
- [Mixed Chat Agents Plugins](./agents/mixed_chat_agents_plugins.py)
- [Mixed Chat Files](./agents/mixed_chat_files.py)
- [Mixed Chat Reset](./agents/mixed_chat_reset.py)
- [Mixed Chat Streaming](./agents/mixed_chat_streaming.py)

### Audio - Using services that support audio-to-text and text-to-audio conversion

- [Chat with Audio Input](./audio/01-chat_with_audio_input.py)
- [Chat with Audio Output](./audio/02-chat_with_audio_output.py)
- [Chat with Audio Input and Output](./audio/03-chat_with_audio_input_output.py)
- [Audio Player](./audio/audio_player.py)
- [Audio Recorder](./audio/audio_recorder.py)

### AutoFunctionCalling - Using `Auto Function Calling` to allow function call capable models to invoke Kernel Functions automatically

- [Azure Python Code Interpreter Function Calling](./auto_function_calling/azure_python_code_interpreter_function_calling.py)
- [Function Calling with Required Type](./auto_function_calling/function_calling_with_required_type.py)
- [Parallel Function Calling](./auto_function_calling/parallel_function_calling.py)
- [Chat Completion with Auto Function Calling Streaming](./auto_function_calling/chat_completion_with_auto_function_calling_streaming.py)
- [Functions Defined in JSON Prompt](./auto_function_calling/functions_defined_in_json_prompt.py)
- [Chat Completion with Manual Function Calling Streaming](./auto_function_calling/chat_completion_with_manual_function_calling_streaming.py)
- [Functions Defined in YAML Prompt](./auto_function_calling/functions_defined_in_yaml_prompt.py)
- [Chat Completion with Auto Function Calling](./auto_function_calling/chat_completion_with_auto_function_calling.py)
- [Chat Completion with Manual Function Calling](./auto_function_calling/chat_completion_with_manual_function_calling.py)
- [Nexus Raven](./auto_function_calling/nexus_raven.py)
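
For orientation, here is a minimal sketch of the auto function calling pattern these samples expand on. It assumes OpenAI credentials come from environment variables, and the `WeatherPlugin` and its canned reply are purely illustrative:

```python
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import kernel_function


class WeatherPlugin:
    """Illustrative plugin; returns canned data instead of calling a real API."""

    @kernel_function(description="Get the current weather for a city.")
    def get_weather(self, city: str) -> str:
        return f"It is sunny in {city}."


async def main() -> None:
    kernel = Kernel()
    service = OpenAIChatCompletion(service_id="chat")  # reads OPENAI_* environment variables
    kernel.add_service(service)
    kernel.add_plugin(WeatherPlugin(), plugin_name="weather")

    # Auto() lets the model decide when to call the plugin's functions.
    settings = OpenAIChatPromptExecutionSettings(function_choice_behavior=FunctionChoiceBehavior.Auto())

    chat_history = ChatHistory()
    chat_history.add_user_message("What's the weather in Paris?")

    result = await service.get_chat_message_content(
        chat_history=chat_history, settings=settings, kernel=kernel
    )
    print(result)


if __name__ == "__main__":
    asyncio.run(main())
```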

### ChatCompletion - Using a [`ChatCompletion`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/chat_completion_client_base.py)-capable service to chat with models

- [Simple Chatbot](./chat_completion/simple_chatbot.py)
- [Simple Chatbot Kernel Function](./chat_completion/simple_chatbot_kernel_function.py)
- [Simple Chatbot Logit Bias](./chat_completion/simple_chatbot_logit_bias.py)
- [Simple Chatbot Store Metadata](./chat_completion/simple_chatbot_store_metadata.py)
- [Simple Chatbot Streaming](./chat_completion/simple_chatbot_streaming.py)
- [Simple Chatbot with Image](./chat_completion/simple_chatbot_with_image.py)
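
The chat completion samples all build on the same basic loop; a minimal sketch is shown below (OpenAI credentials are assumed to come from environment variables, and the model id is an example):

```python
import asyncio

from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    # ai_model_id is an example; omit it to fall back to OPENAI_CHAT_MODEL_ID from the environment.
    service = OpenAIChatCompletion(ai_model_id="gpt-4o-mini")

    chat_history = ChatHistory()
    chat_history.add_system_message("You are a helpful assistant.")
    chat_history.add_user_message("Why is the sky blue?")

    response = await service.get_chat_message_content(
        chat_history=chat_history,
        settings=OpenAIChatPromptExecutionSettings(max_tokens=200),
    )
    print(response)


if __name__ == "__main__":
    asyncio.run(main())
```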

### ChatHistory - Using and serializing the [`ChatHistory`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/contents/chat_history.py)

- [Serialize Chat History](./chat_history/serialize_chat_history.py)
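
As a rough sketch of what serialization looks like (the helper names `serialize` and `restore_chat_history` reflect recent Semantic Kernel Python versions; see the sample for the exact API):

```python
from semantic_kernel.contents import ChatHistory

chat_history = ChatHistory()
chat_history.add_system_message("You are a helpful assistant.")
chat_history.add_user_message("Remember that my favorite color is green.")

# Serialize to a JSON string (e.g. to persist to disk or a database), then restore it.
history_json = chat_history.serialize()
restored = ChatHistory.restore_chat_history(history_json)
print(len(restored.messages))  # 2
```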

### Filtering - Creating and using Filters

- [Auto Function Invoke Filters](./filtering/auto_function_invoke_filters.py)
- [Function Invocation Filters](./filtering/function_invocation_filters.py)
- [Function Invocation Filters Stream](./filtering/function_invocation_filters_stream.py)
- [Prompt Filters](./filtering/prompt_filters.py)
- [Retry with Filters](./filtering/retry_with_filters.py)
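
The general shape of a function invocation filter looks roughly like this (the timing logic is illustrative; the samples also cover prompt and auto function invocation filters):

```python
import time

from semantic_kernel import Kernel
from semantic_kernel.filters import FilterTypes, FunctionInvocationContext

kernel = Kernel()


@kernel.filter(FilterTypes.FUNCTION_INVOCATION)
async def log_duration(context: FunctionInvocationContext, next):
    """Runs around every function invocation: time it, then let the call proceed."""
    start = time.perf_counter()
    await next(context)  # invoke the function (and any remaining filters)
    elapsed = time.perf_counter() - start
    print(f"{context.function.plugin_name}-{context.function.name} took {elapsed:.3f}s")
```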

### Functions - Invoking [`Method`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/functions/kernel_function_from_method.py) or [`Prompt`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/functions/kernel_function_from_prompt.py) functions with [`Kernel`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/kernel.py)

- [Kernel Arguments](./functions/kernel_arguments.py)
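
Beyond the kernel arguments sample, the two basic function types can be sketched as follows (OpenAI environment variables are assumed for the prompt function; the plugin and function names are placeholders):

```python
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import KernelArguments, kernel_function


class MathPlugin:
    @kernel_function(description="Add two integers.")
    def add(self, a: int, b: int) -> int:
        return a + b


async def main() -> None:
    kernel = Kernel()
    kernel.add_service(OpenAIChatCompletion(service_id="chat"))

    # Method function: a decorated Python method registered as part of a plugin.
    kernel.add_plugin(MathPlugin(), plugin_name="math")
    total = await kernel.invoke(plugin_name="math", function_name="add", arguments=KernelArguments(a=2, b=3))
    print(total)

    # Prompt function: a template that is rendered and sent to the chat service.
    summarize = kernel.add_function(
        plugin_name="writer",
        function_name="summarize",
        prompt="Summarize in one sentence: {{$input}}",
    )
    print(await kernel.invoke(summarize, arguments=KernelArguments(input="Semantic Kernel is an SDK that ...")))


if __name__ == "__main__":
    asyncio.run(main())
```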

### Grounding - An example of how to perform LLM grounding

- [Grounded](./grounding/grounded.py)

### Local Models - Using the [`OpenAI connector`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion.py) and [`OnnxGenAI connector`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/onnx/services/onnx_gen_ai_chat_completion.py) to talk to models hosted locally in Ollama, OnnxGenAI, and LM Studio

- [ONNX Chat Completion](./local_models/onnx_chat_completion.py)
- [LM Studio Text Embedding](./local_models/lm_studio_text_embedding.py)
- [LM Studio Chat Completion](./local_models/lm_studio_chat_completion.py)
- [ONNX Phi3 Vision Completion](./local_models/onnx_phi3_vision_completion.py)
- [Ollama Chat Completion](./local_models/ollama_chat_completion.py)
- [ONNX Text Completion](./local_models/onnx_text_completion.py)

### Logging - Showing how to set up logging

- [Setup Logging](./logging/setup_logging.py)
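
The sample wires this up with a helper from the package; a plain standard-library equivalent looks roughly like this (the `semantic_kernel` logger name is an assumption about the package's logger hierarchy):

```python
import logging

# Route all log records to the console with a consistent format.
logging.basicConfig(
    format="[%(asctime)s - %(name)s:%(lineno)d - %(levelname)s] %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
    level=logging.INFO,
)

# Turn up verbosity for Semantic Kernel only.
logging.getLogger("semantic_kernel").setLevel(logging.DEBUG)
```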

### Memory - Using [`Memory`](https://github.com/microsoft/semantic-kernel/tree/main/dotnet/src/SemanticKernel.Abstractions/Memory) AI concepts

- [Azure Cognitive Search Memory](./memory/azure_cognitive_search_memory.py)
- [Memory Data Models](./memory/data_models.py)
- [New Memory](./memory/new_memory.py)
- [Pandas Memory](./memory/pandas_memory.py)

### Model-as-a-Service - Using models deployed as [`serverless APIs on Azure AI Studio`](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio) to benchmark model performance against open-source datasets

- [MMLU Model Evaluation](./model_as_a_service/mmlu_model_eval.py)

### On Your Data - Examples of using AzureOpenAI [`On Your Data`](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/use-your-data?tabs=mongo-db)

- [Azure Chat GPT with Data API](./on_your_data/azure_chat_gpt_with_data_api.py)
- [Azure Chat GPT with Data API Function Calling](./on_your_data/azure_chat_gpt_with_data_api_function_calling.py)
- [Azure Chat GPT with Data API Vector Search](./on_your_data/azure_chat_gpt_with_data_api_vector_search.py)

### Planners - Showing the uses of [`Planners`](https://github.com/microsoft/semantic-kernel/tree/main/python/semantic_kernel/planners)

- [Sequential Planner](./planners/sequential_planner.py)
- [OpenAI Function Calling Stepwise Planner](./planners/openai_function_calling_stepwise_planner.py)
- [Azure OpenAI Function Calling Stepwise Planner](./planners/azure_openai_function_calling_stepwise_planner.py)

### Plugins - Different ways of creating and using [`Plugins`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/functions/kernel_plugin.py)

- [Azure Key Vault Settings](./plugins/azure_key_vault_settings.py)
- [Azure Python Code Interpreter](./plugins/azure_python_code_interpreter.py)
- [OpenAI Function Calling with Custom Plugin](./plugins/openai_function_calling_with_custom_plugin.py)
- [Plugins from Directory](./plugins/plugins_from_dir.py)
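
Two common ways to register a plugin are sketched below; the class, plugin names, and directory path are placeholders, and the directory form assumes `parent_directory` as the argument name:

```python
from semantic_kernel import Kernel
from semantic_kernel.functions import kernel_function


class LightsPlugin:
    @kernel_function(name="toggle", description="Turn a named light on or off.")
    def toggle(self, light: str, on: bool) -> str:
        return f"{light} is now {'on' if on else 'off'}"


kernel = Kernel()

# 1. From a Python class whose methods are decorated with @kernel_function.
kernel.add_plugin(LightsPlugin(), plugin_name="lights")

# 2. From a directory of prompt-based plugin definitions (path is a placeholder).
kernel.add_plugin(plugin_name="FunPlugin", parent_directory="path/to/prompt_plugins")
```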

### Processes - Examples of using the [`Process Framework`](../../semantic_kernel/processes/)

- [Cycles with Fan-In](./processes/cycles_with_fan_in.py)
- [Nested Process](./processes/nested_process.py)

### PromptTemplates - Using [`Templates`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/prompt_template/prompt_template_base.py) with parametrization for `Prompt` rendering

- [Template Language](./prompt_templates/template_language.py)
- [Azure Chat GPT API Jinja2](./prompt_templates/azure_chat_gpt_api_jinja2.py)
- [Load YAML Prompt](./prompt_templates/load_yaml_prompt.py)
- [Azure Chat GPT API Handlebars](./prompt_templates/azure_chat_gpt_api_handlebars.py)
- [Configuring Prompts](./prompt_templates/configuring_prompts.py)
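
A small Jinja2-flavored sketch of the idea (OpenAI environment variables are assumed; the prompt, plugin, and function names are made up):

```python
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import KernelArguments


async def main() -> None:
    kernel = Kernel()
    kernel.add_service(OpenAIChatCompletion(service_id="chat"))

    # Variables in a jinja2 template are filled from KernelArguments at render time.
    greet = kernel.add_function(
        plugin_name="demo",
        function_name="greet",
        prompt="Write a one-line greeting for {{ name }} in the style of {{ style }}.",
        template_format="jinja2",
    )
    print(await kernel.invoke(greet, arguments=KernelArguments(name="Ada", style="a pirate")))


if __name__ == "__main__":
    asyncio.run(main())
```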

### RAG - Different ways of `RAG` (Retrieval-Augmented Generation)

- [RAG with Text Memory Plugin](./rag/rag_with_text_memory_plugin.py)
- [Self-Critique RAG](./rag/self-critique_rag.py)

### Search - Using [`Search`](https://github.com/microsoft/semantic-kernel/tree/main/python/semantic_kernel/connectors/search) services to retrieve information

- [Bing Search Plugin](./search/bing_search_plugin.py)
- [Bing Text Search](./search/bing_text_search.py)
- [Bing Text Search as Plugin](./search/bing_text_search_as_plugin.py)
- [Google Search Plugin](./search/google_search_plugin.py)
- [Google Text Search as Plugin](./search/google_text_search_as_plugin.py)

### Service Selector - Shows how to create and use a custom service selector class

- [Custom Service Selector](./service_selector/custom_service_selector.py)

### Setup - How to set up environment variables for Semantic Kernel

- [OpenAI Environment Setup](./setup/openai_env_setup.py)
- [Chat Completion Services](./setup/chat_completion_services.py)

### Structured Outputs - How to leverage OpenAI's json_schema [`Structured Outputs`](https://platform.openai.com/docs/guides/structured-outputs) functionality

- [JSON Structured Outputs](./structured_outputs/json_structured_outputs.py)
- [JSON Structured Outputs Function Calling](./structured_outputs/json_structured_outputs_function_calling.py)
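
The core of both samples is passing a Pydantic model as the `response_format` execution setting so the reply is constrained to that JSON schema; a sketch (the model id is an example of one that supports Structured Outputs):

```python
import asyncio

from pydantic import BaseModel

from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory


class Step(BaseModel):
    explanation: str
    output: str


class Reasoning(BaseModel):
    steps: list[Step]
    final_answer: str


async def main() -> None:
    service = OpenAIChatCompletion(ai_model_id="gpt-4o-2024-08-06")  # example model id
    settings = OpenAIChatPromptExecutionSettings(response_format=Reasoning)

    chat_history = ChatHistory()
    chat_history.add_user_message("Solve 8x + 7 = -23 step by step.")

    response = await service.get_chat_message_content(chat_history=chat_history, settings=settings)
    answer = Reasoning.model_validate_json(str(response))  # the reply is JSON matching the schema
    print(answer.final_answer)


if __name__ == "__main__":
    asyncio.run(main())
```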

### TextGeneration - Using a [`TextGeneration`](https://github.com/microsoft/semantic-kernel/blob/main/python/semantic_kernel/connectors/ai/text_completion_client_base.py)-capable service with models

- [Text Completion Client](./local_models/onnx_text_completion.py)

# Configuring the Kernel

