release-process - Claude MCP Skill

Release Process


Documentation

SKILL.md
# Release Process

## Creating a New Release

1. **Update pyproject.toml version:**
   - Update the `version = "0.X.Y"` field in `pyproject.toml` to the new release version
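
   For reference, the relevant `pyproject.toml` fragment has this shape (the version shown is illustrative):

   ```toml
   [project]
   name = "cicada-mcp"
   version = "0.X.Y"  # bump to the new release version
   ```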

2. **Commit version changes:**
   ```bash
   git add pyproject.toml
   git commit -m "Bump version to 0.X.Y"
   ```

3. **Create and push the git tag:**
   ```bash
   git tag v0.X.Y
   git push origin v0.X.Y
   git push origin main
   ```

4. **GitHub Actions will automatically:**
   - Run tests
   - Generate `cicada/_version_hash.py` with the current tag and commit
   - Build the package
   - Publish to PyPI
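
   A tag-triggered workflow of this shape might look like the following sketch (the file path, job layout, and version-hash script name are assumptions, not the repo's actual workflow):

   ```yaml
   # .github/workflows/release.yml (illustrative sketch)
   on:
     push:
       tags:
         - "v*"
   jobs:
     release:
       runs-on: ubuntu-latest
       steps:
         - uses: actions/checkout@v4
         - run: make test                           # run the test suite
         - run: python scripts/gen_version_hash.py  # writes cicada/_version_hash.py (assumed script name)
         - run: python -m build                     # build sdist and wheel
         - uses: pypa/gh-action-pypi-publish@release/v1
   ```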

5. **Test the installation:**
   ```bash
   uv tool install cicada-mcp
   cicada --version  # Should show version and commit hash
   ```

## Version Management

- Version number is stored in `pyproject.toml`
- Git tag and commit hash are stored in `cicada/_version_hash.py` (auto-generated, not committed)
- `cicada --version` or `cicada -v` displays version, tag, and commit hash
- Version format: `cicada 0.2.0 (v0.2.0-rc0/5ea1134)` - the tag/hash suffix makes RC (release-candidate) builds traceable
- **For PyPI installs:** Shows the version, tag, and hash from when the package was built by CI/CD
- **For local development:** The file is generated by `make pre-commit`; if it is missing, `cicada` falls back to `git describe --tags` and `git rev-parse HEAD`
- The file is in `.gitignore` to avoid merge conflicts while still being included in PyPI releases
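
The fallback behavior described above could be sketched as follows; the module attribute names (`TAG`, `COMMIT`) and helper names are assumptions for illustration, not the actual `cicada` implementation:

```python
import subprocess


def format_version(version: str, tag: str, commit: str) -> str:
    """Render the version string, e.g. 'cicada 0.2.0 (v0.2.0-rc0/5ea1134)'."""
    return f"cicada {version} ({tag}/{commit})"


def get_tag_and_commit():
    """Prefer the generated _version_hash.py; otherwise fall back to git."""
    try:
        # Auto-generated by CI/CD or `make pre-commit` (attribute names assumed)
        from cicada._version_hash import TAG, COMMIT
        return TAG, COMMIT
    except ImportError:
        # No generated file (fresh local checkout): ask git directly
        def run(*args):
            return subprocess.run(
                args, capture_output=True, text=True, check=True
            ).stdout.strip()

        return (
            run("git", "describe", "--tags"),
            run("git", "rev-parse", "--short", "HEAD"),
        )
```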


Information

Repository
wende/cicada
Author
wende
