Dify v0.3.29: Support for GPT-4 Turbo & Vision - The Gateway to Multimodal Interactions
Nov 8, 2023
v0.3.29 Feature Highlights:
GPT-4 Turbo & Vision model support
External data API tools
Moderation for sensitive content
GPT-4 Turbo & Vision Integration:
Dify.AI has swiftly integrated OpenAI's newly released GPT-4 Turbo, currently OpenAI's most powerful LLM with a 128K context window. Even more exciting is our upcoming expansion into multimodal capabilities, including GPT-4 Vision for multi-image understanding.
API Tools - External Data Access in Real Time
Previously, developers could upload text, structured data, and documents to Dify through dataset management. Now, External Data API tools can pull in not only internal data, such as database queries or internal search results, but also external sources such as weather services, Google results, or Amazon product information. This improves both efficiency and security, especially for sensitive data, and customizing internal search lets you tailor results to your application's needs for a more personalized and accurate user experience.
For self-hosted Dify developers, Code-based Extensions in Python complement API-based Extensions, expanding capabilities without altering existing code or building a separate API service. >>Learn more
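To make the External Data API flow concrete, here is a minimal sketch of the endpoint handler such a tool might call. The field names (`point`, `params`, `inputs`, and the `app.external_data_tool.query` extension point) follow our documentation's request shape, but the weather lookup, variable names, and response text below are illustrative placeholders, not a definitive implementation.

```python
# Hypothetical handler for an API-based extension request.
# Dify POSTs a JSON body whose "point" field identifies the extension
# type; an external data tool is expected to return a "result" string
# that gets injected into the prompt context.

WEATHER_DB = {"London": "12°C, light rain"}  # stand-in for a real weather API


def handle_extension_request(body: dict) -> dict:
    if body.get("point") == "app.external_data_tool.query":
        params = body.get("params", {})
        city = params.get("inputs", {}).get("city", "")
        weather = WEATHER_DB.get(city, "no data")
        return {"result": f"Current weather in {city}: {weather}"}
    return {"error": "unsupported extension point"}


# Example payload Dify might send to the endpoint:
request_body = {
    "point": "app.external_data_tool.query",
    "params": {
        "app_id": "demo-app",
        "tool_variable": "weather",
        "inputs": {"city": "London"},
        "query": "Should I bring an umbrella?",
    },
}
print(handle_extension_request(request_body))
```

In production this handler would sit behind an HTTP server that verifies the API key Dify sends in the `Authorization` header before doing any lookup.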
Moderation for Customized AI Safety:
To further enable secure, compliant AI interactions, the newest Dify release adds Moderation. You can set up checks in three ways: OpenAI's moderation API, custom sensitive-word filters, or API extensions for fully personalized moderation. Filtering applies to both input and output, with preset responses returned when a check is triggered, so you can tailor moderation to your app's needs. >>Learn more
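The custom sensitive-word option can be sketched in a few lines. The word list and preset reply below are illustrative placeholders; the key idea is the dual application to both user input and model output, with a preset response substituted when a check trips.

```python
# Minimal sketch of a custom sensitive-word filter with a preset
# response, mirroring the dual input/output moderation described above.
# SENSITIVE_WORDS and PRESET_RESPONSE are hypothetical examples.

SENSITIVE_WORDS = {"secret_project", "internal_codename"}
PRESET_RESPONSE = "Sorry, I can't discuss that topic."


def moderate(text: str) -> tuple[bool, str]:
    """Return (flagged, text_to_show); run on both input and output."""
    lowered = text.lower()
    if any(word in lowered for word in SENSITIVE_WORDS):
        return True, PRESET_RESPONSE
    return False, text


flagged, reply = moderate("Tell me about secret_project")
print(flagged, reply)
```

Swapping this function for a call to OpenAI's moderation endpoint, or for an HTTP request to your own API extension, gives the other two setup modes without changing the surrounding app logic.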
Eager for more? We are actively developing multimodal features that will soon transform the way we interact with AI.