Intro to Streamlit
- Simple and complex Streamlit examples
- Data and state management in Streamlit apps
- Data widgets for Streamlit apps
- Deploying Streamlit apps
import EasyCrypt
keystring = "SixteenByteKey!!"
inpstring = "Some super secret string, that I don't want you to see."
# This is the initialisation vector (IV)/nonce. I generated it once with the
# code below and printed it out. Since you need the same IV to decrypt later,
# store it rather than generating a new one each time:
#
# import os
# from binascii import hexlify, unhexlify
# ivstring = hexlify(os.urandom(16)).decode()
ivstring = "aba0a3bde34a03487eda3ec96d5736a8"
crypted = EasyCrypt.encrypt_string(keystring, inpstring, ivstring)
print(crypted)
decrypted = EasyCrypt.decrypt_string(keystring, crypted, ivstring)
print(decrypted)
# main.py
import json
from pydantic import BaseModel, EmailStr, ValidationError, validator
class Employee(BaseModel):
    name: str
    age: int
    email: EmailStr
    department: str
    employee_id: str

    @validator("employee_id")
    def validate_employee_id(cls, v):
        if not v.isalnum() or len(v) != 6:
            raise ValueError("Employee ID must be exactly 6 alphanumeric characters")
        return v

# Load and parse the JSON data
with open("employees.json", "r") as f:
    data = json.load(f)

# Validate each employee record
for record in data:
    try:
        employee = Employee(**record)
        print(f"Valid employee record: {employee.name}")
    except ValidationError as e:
        print(f"Invalid employee record: {record.get('name', 'unknown')}")
        print(f"Errors: {e.errors()}")
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I love you'))"
The TextWrapper class provides functionality for wrapping long pieces of text into multiple shorter lines while preserving the initial and subsequent indents.
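A minimal sketch of that behavior, using the standard-library textwrap module (the width and indent strings here are just illustrative values):

import textwrap

sample = (
    "The TextWrapper class wraps long pieces of text into multiple "
    "shorter lines while preserving the initial and subsequent indents."
)

wrapper = textwrap.TextWrapper(
    width=40,              # maximum length of each wrapped line
    initial_indent="* ",   # prefix prepended to the first line only
    subsequent_indent="  ",  # prefix prepended to every following line
)
print(wrapper.fill(sample))

The same TextWrapper instance can be reused across many strings, which is why the class exists alongside the one-shot textwrap.fill() function.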
- Shows how an individual can use Python for speech recognition (SR), Push-to-Talk (PTT) systems, and large action models to create their own AI assistant.
- Describes the process of creating an "à la Rabbit" prototype with Python code on Raspberry Pi hardware.
- Emphasizes how these components can be combined for various tasks using different APIs like OpenAI or LLaMA (Meta's large language model).
A prototype tool powered by Large Language Models to make querying your databases as easy as saying the word.
- Introduction to QueryGPT, a tool using Large Language Models (LLMs) for natural language database queries
- Focus on implementing a basic iteration of the system, with potential for significant enhancements
- Aim is to provide the LLM with the database schema and have it answer questions based on that context
- Discussion on prompt engineering, which is designing inputs for generative AI tools to produce optimal results
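The schema-in-the-prompt idea from the bullets above can be sketched roughly like this. The helper names (get_schema, build_prompt) and the example table are hypothetical, and the actual LLM call is left out; the point is only how the database schema and the user's question are combined into one prompt:

import sqlite3

def get_schema(conn: sqlite3.Connection) -> str:
    """Collect the CREATE TABLE statements that describe the database."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND sql IS NOT NULL"
    )
    return "\n".join(row[0] for row in rows)

def build_prompt(schema: str, question: str) -> str:
    """Give the LLM the schema as context, then ask for SQL only."""
    return (
        "You are given this database schema:\n"
        f"{schema}\n\n"
        "Answer the question below with a single SQL query and nothing else.\n"
        f"Question: {question}"
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
prompt = build_prompt(get_schema(conn), "How many employees are in each department?")
print(prompt)
# The assembled prompt would then be sent to whichever LLM you use.

The "SQL only" instruction in build_prompt is one small example of the prompt engineering mentioned above: constraining the output format so the response can be executed directly.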
Quick reference for the significant changes introduced with each new version of Python. It covers changes to Python syntax and the standard library, as well as valuable tools, links, and utilities that can aid with upgrading code bases.