3sparks Chat
iOS Universal / Productivity
Whether you’re using LM Studio or Ollama to run LLMs locally on your Mac, Windows PC, or Linux machine, or tapping into the OpenAI API, 3sparks Chat is your go-to mobile client. Chat with your LLMs anytime, anywhere, while keeping control of your data and experience, optimized for iOS and visionOS. Currently, only text conversations are supported.
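For anyone curious what a chat request like this looks like under the hood, here is a minimal sketch of an OpenAI-compatible call such a client might send. It assumes LM Studio's default local server address (http://localhost:1234/v1) and a placeholder model name; these names are illustrative only and do not show the app's actual networking code.

```swift
import Foundation

// Minimal sketch: POST a chat message to an OpenAI-compatible server.
// The URL assumes LM Studio's default local server; Ollama and the
// hosted OpenAI API expose the same /v1/chat/completions shape.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
    let stream: Bool
}

struct ChatResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]
}

func sendChat(_ text: String) async throws -> String {
    let url = URL(string: "http://localhost:1234/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "local-model",
                    messages: [ChatMessage(role: "user", content: text)],
                    stream: false)
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```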
What's new in the latest version?
• Enhanced Code Display: Improved rendering of code blocks for better readability
• New Streaming Controls: Toggle streaming on/off to suit your needs
• Token Usage Insights: View detailed token usage statistics when streaming is disabled (perfect for LM Studio users; see the sketch after this list)
• Performance Boost: Optimized app performance and refined user interface elements
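The token statistics mentioned above are most likely read from the usage object that OpenAI-compatible servers include in non-streamed responses. The struct below is a hypothetical Swift sketch of decoding those fields; the key names follow the standard OpenAI response format, not the app's internal types.

```swift
import Foundation

// Hypothetical sketch of the token-usage fields available when
// streaming is off: non-streamed OpenAI-compatible responses carry
// a "usage" object that a client can decode and display.
struct TokenUsage: Codable {
    let promptTokens: Int
    let completionTokens: Int
    let totalTokens: Int

    enum CodingKeys: String, CodingKey {
        case promptTokens = "prompt_tokens"
        case completionTokens = "completion_tokens"
        case totalTokens = "total_tokens"
    }
}

struct CompletionEnvelope: Codable {
    let usage: TokenUsage?   // present when stream is false
}

// Example: pull a readable summary out of a raw response body.
func usageSummary(from data: Data) -> String? {
    guard let usage = try? JSONDecoder()
        .decode(CompletionEnvelope.self, from: data).usage else {
        return nil
    }
    return "\(usage.promptTokens) prompt + \(usage.completionTokens) completion = \(usage.totalTokens) tokens"
}
```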
We're constantly working to improve your experience. Thank you for using 3sparks Chat!