# AI Recommendations API

Source: https://docs.dappier.com/ai_recommendations_api

Convert any data into a recommendation engine with Dappier's AI-powered Article Recommendations API. The API uses a RAG (Retrieval-Augmented Generation) model to return article recommendations based on an input query or URL.

You'll need an API key to access this API. Visit the [Dappier Platform](https://platform.dappier.com) to sign up and create an API key under Settings > Profile > API Keys.

#### Using AI Recommendations API

This section demonstrates retrieving recommendations based on natural language queries or URLs. You can get the data model ID of the desired data model from the [Dappier Marketplace](https://marketplace.dappier.com); data model IDs start with `dm_`. The API reference for this endpoint can be found [here](https://docs.dappier.com/api-reference/endpoint/real-time-search).
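If you are working in Python, the `results` array returned by the API can be consumed directly once the request succeeds. Below is a minimal sketch that reuses the same endpoint, data model ID, and request body as the Python example that follows, and prints a few fields from each recommendation; the API key is a placeholder, and the fields referenced are the ones shown in the sample response further down.

```python Python
import requests

url = "https://api.dappier.com/app/datamodel/dm_01j0pb465keqmatq9k83dthx34"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_API_KEY",  # placeholder: use your Dappier API key
}
payload = {
    "query": "lifestyle new articles",
    "similarity_top_k": 3,
    "search_algorithm": "most_recent",
    "page": 1,
    "num_results": 10,
}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()

# Each recommendation includes fields such as title, site, pubdate, and url
# (see the sample response below).
for article in response.json().get("results", []):
    print(f"{article['title']} — {article['site']} ({article['pubdate']})")
    print(article["url"])
```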
```python Python
import requests
import json

url = "https://api.dappier.com/app/datamodel/dm_01j0pb465keqmatq9k83dthx34"

payload = json.dumps({
  "query": "lifestyle new articles",
  "similarity_top_k": 3,
  "ref": "",
  "num_articles_ref": 0,
  "search_algorithm": "most_recent",
  "page": 1,
  "num_results": 10
})
headers = {
  'Content-Type': 'application/json',
  'Authorization': 'Bearer '
}

response = requests.post(url, headers=headers, data=payload)
print(response.text)
```

```go Go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"io/ioutil"
)

func main() {
	url := "https://api.dappier.com/app/datamodel/dm_01j0pb465keqmatq9k83dthx34"
	apiKey := ""

	query := `{
		"query": "lifestyle new articles",
		"similarity_top_k": 3,
		"ref": "",
		"num_articles_ref": 0,
		"search_algorithm": "most_recent",
		"page": 1,
		"num_results": 10
	}`

	req, err := http.NewRequest("POST", url, bytes.NewBuffer([]byte(query)))
	if err != nil {
		fmt.Println("Error creating request:", err)
		return
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)
	req.Header.Set("Content-Type", "application/json")

	client := &http.Client{}
	resp, err := client.Do(req)
	if err != nil {
		fmt.Println("Error making request:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := ioutil.ReadAll(resp.Body)
	fmt.Println(string(body))
}
```

```bash cURL
curl --location --request POST 'https://api.dappier.com/app/datamodel/dm_01j0pb465keqmatq9k83dthx34' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer ' \
--data-raw '{
    "query": "lifestyle new articles",
    "similarity_top_k": 3,
    "ref": "",
    "num_articles_ref": 0,
    "search_algorithm": "most_recent",
    "page": 1,
    "num_results": 10
}'
```

```json response
{
  "results": [
    {
      "author": "Rusty Weiss",
      "image_url": "https://images.dappier.com/dm_01j0pb465keqmatq9k83dthx34/Olympics-Rugby-Sevens-Women-Semifinal-USA-vs-NZL-23861886-scaled-e1725556056406_.jpg?width=428&height=321",
      "preview_content": "

American rugby player Ilona Maher’s popularity is at its absolute peak. Demand for her has skyrocketed since her performance in […]\n\nThe post Olympic Rugby Hero Ilona Maher Joins ‘Dancing With The Stars’ appeared first on Bounding Into Sports.
", "pubdate": "Thu, 05 Sep 2024 17:30:27 +0000", "site": "Bounding Into Sports", "site_domain": "www.boundingintosports.com", "title": "Olympic Rugby Hero Ilona Maher Joins ‘Dancing With The Stars’", "url": "https://api.dappier.com/app/track/NB2HI4DTHIXS653XO4XGE33VNZSGS3THNFXHI33TOBXXE5DTFZRW63JPGIYDENBPGA4S633MPFWXA2LDFVZHKZ3CPEWWQZLSN4WWS3DPNZQS23LBNBSXELLKN5UW44ZNMRQW4Y3JNZTS253JORUC25DIMUWXG5DBOJZS6===?type=article_click&site_domain=www.boundingintosports.com&datamodel_id=dm_01j0pb465keqmatq9k83dthx34&request_id=3ac73c01974c-kqHLh38Cqy-2824988&origin=" }, ... ] } ``` # AI Recommendations API Source: https://docs.dappier.com/api-reference/endpoint/ai-recommendations post /app/v2/search Retrieve articles based on queries or URLs, powered by semantic search. # Real Time Search Source: https://docs.dappier.com/api-reference/endpoint/real-time-search post /app/aimodel/{ai_model_id} Access real-time search with Dappier's AI models — from live web search and breaking news to stock prices and financial insights. Fast, reliable, and ready for monetization. # Bot Deterrence Source: https://docs.dappier.com/bot-deterrence Bots are automated programs that perform tasks on the internet. While some bots, like search engine crawlers, are beneficial, others, like scrapers and spammers, can harm your site. Bad bots can cause massive problems for web properties. Too much bot traffic can put a heavy load on web servers, slowing or denying service to legitimate users (DDoS attacks are an extreme version of this scenario). Bad bots can also scrape or download content from a website, steal user credentials, take over user accounts, rapidly post spam content, and perform various other kinds of attacks. Bot management is necessary to prevent these performance and security impacts on websites, applications, and APIs, by leveraging a range of security, machine learning, and web development technologies to accurately detect bots and block malicious activity while allowing legitimate bots to operate uninterrupted. **What are the bot detection techniques?** Some of the popular bot detection techniques include * Browser fingerprinting – this refers to information that is gathered about a computing device for identification purposes (any browser will pass on specific data points to the connected website’s servers, such as your operating system, language, plugins, fonts, hardware, etc.) * Browser consistency – checking the presence of specific features that should or should not be in a browser. This can be done by executing certain JavaScript requests. * Behavioral inconsistencies – this involves behavioral analysis of nonlinear mouse movements, rapid button and mouse clicks, repetitive patterns, average page time, average requests per page, and similar bot behavior. * CAPTCHA – a popular anti-bot measure, is a challenge-response type of test that often asks you to fill in correct codes or identify objects in pictures. **Best Practices for Bot Deterrence** * Create a robots.txt file for your website. A good starting point might be to provide crawling instructions for bots accessing your website's resources. See these examples of Google's robots.txt file. * Implement CAPTCHAs: Use CAPTCHAs to distinguish between human users and bots. * Rate Limiting: Set limits on the number of requests a user can make in a given timeframe. * Regular Monitoring: Continuously monitor traffic and update your security settings to stay ahead of bot activity. * Set up a web application firewall (WAF). 
WAFs can be used to filter out suspicious requests and block IP addresses based on various factors * Use honeypot traps. Honeypots are specifically designed to attract unwanted or malicious bots, allowing websites to detect bots and ban their IP addresses. **Cloudflare Bot Solutions** Cloudflare Bot Solutions offer comprehensive protection against malicious bots that can disrupt your website's performance and security. By leveraging Cloudflare's global network and advanced machine learning algorithms, you can effectively detect and deter unwanted bot traffic, ensuring a seamless experience for your legitimate users. Key Features: 1. Bot Fight Mode: * Active Defense: Automatically detects and mitigates bot traffic by deploying challenge-response tests. * Ease of Use: Simple to enable from the Cloudflare dashboard, providing immediate protection without complex configurations. 2. Bot Management: * Machine Learning Algorithms: Uses sophisticated machine learning to differentiate between good and bad bots. * Behavioral Analysis: Monitors traffic patterns to identify and block suspicious activities in real-time. * Custom Rules: Allows you to create custom rules to address specific bot threats tailored to your business needs. 3. Threat Intelligence: * Global Insights: Utilizes data from Cloudflare's extensive network to stay ahead of evolving bot threats. * Updated Databases: Regularly updates bot databases to include new and emerging bot patterns, ensuring up-to-date protection. 4. Analytics and Reporting: * Detailed Insights: Provides comprehensive analytics and reports on bot traffic, helping you understand the impact and adjust defenses accordingly. * User-friendly Dashboard: Easy-to-navigate dashboard where you can monitor bot activity and manage your settings. **How to Implement Cloudflare Bot Solutions** * Sign Up and Add Your Site: Create an account on Cloudflare and add your domain. Follow the prompts to configure your DNS settings. * Enable Bot Fight Mode: Navigate to the Firewall section in the Cloudflare dashboard and toggle on Bot Fight Mode. This feature will start protecting your site immediately. ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-bot-navigate.png) Activate Bot Fight Mode and Block AI Scrapers and Crawlers if not enabled already ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-bot-enable.png) * Configure Bot Management: Access the Bot Management section to set up advanced rules and customize your bot protection strategy. Use the provided analytics to monitor bot activity and adjust settings as needed. * Monitor and Adjust: Regularly check the Cloudflare dashboard for updates on bot traffic and make necessary adjustments to your bot management rules to keep your site protected against new threats. ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-monitor-1.png) The dashboard gives information about threats and bots identified by Cloudflare ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-monitor-2.png) **Bot detection using AWS WAF** AWS WAF is a web application firewall that helps protect your web applications from common web exploits that could affect application availability, compromise security, or consume excessive resources. With AWS WAF, you can create rules to filter web traffic based on various conditions, including IP addresses, HTTP headers, URI strings, and the origin of the requests. To begin using AWS WAF, follow these steps: 1. 
Create a Web ACL (Access Control List): Navigate to the AWS WAF & Shield console. ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-3.png) Create a new Web ACL and associate it with your CloudFront distribution, API Gateway, or ALB (Application Load Balancer). ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-2.png) ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-4.png) Add AWS resources that need to be monitored ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-5.png) 2. Add Rules to Detect Bots: * User-Agent Filtering: Add a custom rule to inspect the User-Agent header. You can block requests with known bot User-Agent strings or allow only specific User-Agent strings. * Rate-Based Rules: Use rate-based rules to limit the number of requests from a single IP address. This can help mitigate bots that generate high volumes of traffic. * AWS Managed Rules: Utilize AWS Managed Rules for bots and scraping detection. AWS offers a Bot Control rule group that specifically targets known bot traffic. ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-6.png) You can choose AWS Managed Rules to begin with and configure to block bots depending on configurations. ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-7.png) * Click on “Add to web ACL” and edit to add required configurations ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-8.png) * You can choose multiple categories based on your requirements to block the bots ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-9.png) * Once the rules are added, monitoring can be done using the bot control dashboard ![](https://mintlify.s3.us-west-1.amazonaws.com/dappier-20/images/cf-waf-10.png) **Boilerplate for robots.txt** In the below example, currently known AI data scrapers and undocumented AI agents are blocked. You can use it as a starting point and manually customize it as needed. ``` User-agent: Applebot-Extended Disallow: / User-agent: Bytespider Disallow: / User-agent: CCBot Disallow: / User-agent: ClaudeBot Disallow: / User-agent: Diffbot Disallow: / User-agent: FacebookBot Disallow: / User-agent: Google-Extended Disallow: / User-agent: GPTBot Disallow: / User-agent: Meta-ExternalAgent Disallow: / User-agent: omgili Disallow: / User-agent: Timpibot Disallow: / User-agent: anthropic-ai Disallow: / User-agent: Claude-Web Disallow: / User-agent: cohere-ai Disallow: / ``` You can validate robots.txt using [this](https://technicalseo.com/tools/robots-txt/) online service. # Configure Your AI Agent Source: https://docs.dappier.com/configure-ai-agent Welcome to Dappier's Create AI Agent tool! With this feature, you can configure your AI Agent to suit your needs. To begin, navigate to [My AI Agents](https://platform.dappier.com/my-ai) and click on **Create AI Agent**. With just a few steps you can fine tune your AI Agent. Here’s how: ### Sync Your Content Get started with building your AI agent with knowledge sources such as RSS feeds or Airtable. Create AI Agent Sync Your Content #### RSS Feeds Integration Train your AI agent by connecting it to one or more RSS feeds. These feeds will serve as valuable sources of information for your agent to learn from. Your AI agent will parse the content of these feeds and use it to improve its understanding and responses. 
#### Airtable Integration Alternatively, you can integrate your AI agent with Airtable, a powerful database management tool. To set up this integration, you'll need to provide the following details: BaseID, TableID(s), and a Personal Access Token (PAT). This information allows your AI agent to access and retrieve data from your Airtable database. For detailed instructions on obtaining these details from your Airtable account, refer to the following link: [Creating Personal Access Tokens](https://support.airtable.com/docs/creating-personal-access-tokens) > **Note:** Fine-tuning your AI agent is a crucial process that ensures optimal performance and accuracy in its responses. This process may take a few minutes to complete, depending on the complexity of your agent and the amount of data being processed. It's recommended to initiate the fine-tuning process and return shortly to check on its progress. This allows the AI agent to iteratively adjust its algorithms and parameters to better understand and respond to user queries effectively. Your patience during this period is appreciated as it contributes to the overall improvement of your AI agent's performance. Once you've added your content source, you can continue configuring your agent with the following: Create AI Agent Configure ### Name of AI Agent Choose a name that reflects the purpose or function of your AI agent. This name will help users identify and interact with your agent effectively. ### Description of What This AI Agent Can Do Provide a brief overview of the capabilities and functionalities of your AI agent. This description will help users understand the scope and purpose of the agent. ### Instructions & Persona Specify the instructions and persona that you want your AI agent to follow. Define the behavior, tone, and style that align with your brand or objectives. Whether you want your agent to be formal, friendly, or informative, setting clear instructions and persona will ensure consistent and effective interactions with users. > *Here is an example we like: Your name is Pat, and you are a real-time data AI assistant with access to real-time data. Your replies are kind, fun, truthful, and informal, and you can use emojis. Respond to all prompts ideal for text messaging. No need to greet yourself or the user in every response.* ### And that’s it! You are ready to fine tune your AI Agent. You can always tweak your configurations and edit your content sources to further update your agent. ### Enter Prompt Samples (Optional) Offer example prompts to guide effective user interactions with the AI agent. Include a variety of clear and concise samples covering different topics and actions. Encourage exploration of the agent's capabilities and gather user feedback for continuous improvement. > **Note:** You can enter your prompt samples after you begin fine tuning your AI agent. # Configure AskAI Source: https://docs.dappier.com/configure-ask-ai # AskAI Widget Customization Fields Dappier’s fully white-labeled, responsive AskAI widget seamlessly integrates into any website, delivering AI-powered answers, insights, and personalized interactions. With extensive customization options, you can tailor the widget’s appearance, behavior, and functionality to match your brand’s identity and meet user needs. Use the following options to customize the widget as needed and create a seamless, engaging user experience. 
From advanced styling settings to flexible configurations for content recommendations and tracking, the AskAI widget offers unparalleled adaptability to elevate your website’s engagement and user experience.

## 1. General

* **Title**: Title text of the widget (Default: "Ask AI").
* **Search Placeholder Text**: Placeholder text in the search field (Default: "Ask a question...").
* **Ask Button Text**: Text displayed on the "Ask" button (Default: "Ask", Max 6 characters).
* **Version**: Select the version of the widget to use (Default: latest).
* **API Key**: Select the API key to use for the widget.
* **Subdomain**: Specify the subdomain for the widget.

## 2. Appearance

* **Main Logo URL**: Specify a URL for the widget's main logo that appears on the top right if enabled.
* **Chat Icon URL**: Specify a URL for the widget's chat icon.
* **Main Background Color**: The background color of the widget (Default: `#F2F2F2`).
* **Ask Button Color**: The color of the "Ask" button (Default: `#674AD9`).
* **Prompt Suggestion Background Color**: Background color for prompt suggestions (Default: `#674AD9`).
* **Prompt Suggestion Text Color**: Text color for prompt suggestions (Default: `#FFFFFF`).
* **Message Background Color**: Background color for chat messages (Default: `#FFFFFF`).
* **Message Text Color**: Text color for chat messages (Default: `#000000`).
* **Title Color**: Text color of the widget title (Default: `#3C3838`).
* **Container Border Radius**: Radius for the widget container's corners (Default: `0px`).
* **Element Border Radius**: Radius for widget elements (Default: `4px`).
* **Main Logo Width (Mobile)**: Width of your main logo when displayed on mobile devices (Default: `45px`).
* **Chat Icon Width (Mobile)**: Size of the chat icon when displayed on mobile devices (Default: `16px`).
* **Main Logo Width (Desktop)**: Width of your main logo when displayed on desktop devices (Default: `90px`).
* **Chat Icon Width (Desktop)**: Size of the chat icon when displayed on desktop devices (Default: `31px`).
* **Font Size Header (Mobile)**: Font size of the header when displayed on mobile devices (Default: `16px`).
* **Font Size Default (Mobile)**: Default font size of the widget elements when displayed on mobile devices (Default: `16px`).
* **Font Size Header (Desktop)**: Font size of the header when displayed on desktop devices (Default: `16px`).
* **Font Size Default (Desktop)**: Default font size of the widget elements when displayed on desktop devices (Default: `16px`).
* **Fixed Mobile Height**: Set the widget height to a fixed value when displayed on mobile devices (Default: `350px`).
* **Expandable Desktop Height**: Set the widget height to expand to the specified value when displayed on desktop devices (Default: `500px`).

## 3. Behavior

* **Enable Title**: Toggle to display the widget title (Default: Enabled).
* **Enable Prompt Suggestions**: Toggle to display prompt suggestions within the widget (Default: Enabled).
* **Enable Opening Content Links in New Window**: Toggle to open related content in a new window in the chat response (Default: Enabled).
* **Enable Content Recommendations**: Toggle to show content recommendations (Default: Enabled).\
  *Note*: When enabled, the widget height will increase by `130px` on desktop and `120px` on mobile devices.
* **Enable Site Name**: Toggle to display the site name in content recommendations (Default: Enabled).
* **Show “Powered by Dappier” label**: Toggle to display “Powered by Dappier” label in the widget footer (Default: Enabled).

## 4. Advanced

* **Referring URL**: A URL to refer to when interacting with the widget.
* **Disclaimer Link**: A link to the widget's disclaimer.
* **Custom Data**: Additional custom data that can be sent along with the widget for tracking purposes. Needs to be configured in the widget embed code.\
  *Example*: `customData='[{"placementType": "top", "name": "sidewidget"}]'`
* **Initial Search Query**: A predefined search query when the widget loads. Useful for embedding the AskAI Widget in a search results page. Needs to be configured in the widget embed code.\
  *Example*: `initialSearchQuery="news articles"`

# 🪄 Activepieces AI TweetBot: Auto-Tweet Viral Content Using Dappier, OpenAI, and Twitter

Source: https://docs.dappier.com/cookbook/recipes/activepieces-auto-tweet-real-time-news-workflow

You can also import this template directly from here - [https://cloud.activepieces.com/templates/KOs0dyWetOZorKPQBiH96](https://cloud.activepieces.com/templates/KOs0dyWetOZorKPQBiH96)

This notebook demonstrates how to create a fully automated Tweet generator using **Activepieces**, **Dappier**, and **OpenAI**. By combining real-time trending insights with natural language generation, this automation crafts and posts tweets—autonomously and repeatedly.

In this notebook, you'll explore:

* **Activepieces**: A no-code, open-source automation platform that allows you to run scheduled flows with intelligent decision logic, integrations, and multi-step workflows.
* **Dappier**: A platform connecting LLMs and automations to real-time, rights-cleared data from trusted sources. It specializes in domains like trending news, AI research, finance, and lifestyle, delivering prompt-ready data with up-to-date insights.
* **OpenAI**: Used here to transform raw data into engaging tweet-sized content, leveraging the creative power of GPT-4o.
* **Twitter**: Publishes the final tweet live to your feed—automatically and without human intervention.

This setup not only demonstrates a practical use case of autonomous content generation but also provides a blueprint for automating other types of social media content with real-time intelligence and LLMs.

## ⏰ Schedule-Based Trigger Setup

To kick off the automation, we’ll use a time-based trigger that runs every few minutes to check for trending content and post a tweet.

### Step 1: Set the Trigger – Every X Minutes

Search for the **Schedule** piece and select the **Every X Minutes** trigger. Configure it as follows:

* **Frequency**: `Every 1 minute` *(You can adjust this value depending on how frequently you want to post content—e.g., every 15 minutes, 1 hour, etc.)*

This trigger ensures the workflow runs at regular intervals without manual input. Once set, the automation is live and ticking on a timed loop, ready to pull new content and generate tweets.

## 🌐 Fetch Trending Content using Dappier

With the schedule trigger in place, the next step is to retrieve real-time trending news content that will serve as the basis for the tweet.

### Step 2: Add Dappier – Real Time Data

Add a **Dappier** action and choose the `Real Time Data` action. Configure the following:

* **Query**:

  ```text
  Latest trending news in artificial intelligence, AI, and machine learning this week
  ```

This will fetch up-to-date and rights-cleared news and insights from trusted sources in the AI domain. The response is structured and optimized for downstream LLM consumption.
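If you want to sanity-check this query outside of Activepieces, the `Real Time Data` action corresponds to Dappier's Real Time Search endpoint documented earlier (`POST /app/aimodel/{ai_model_id}`). Below is a minimal Python sketch of an equivalent direct call; it assumes the same base URL and bearer-token authentication as the recommendations example earlier, that the endpoint accepts a `query` field in the request body, and placeholder values for the API key and AI model ID (copy the real ID from the Dappier Marketplace).

```python Python
import requests

# Placeholders: substitute your Dappier API key and the AI model ID of the
# Real Time Data model from the Dappier Marketplace.
DAPPIER_API_KEY = "your-dappier-api-key"
AI_MODEL_ID = "your-ai-model-id"

url = f"https://api.dappier.com/app/aimodel/{AI_MODEL_ID}"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {DAPPIER_API_KEY}",
}
payload = {
    "query": "Latest trending news in artificial intelligence, AI, and machine learning this week"
}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
print(response.json())  # prompt-ready insights to feed into the next step
```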
## ✍️ Generate Tweet Using OpenAI

Now that you’ve retrieved trending AI news using Dappier, the next step is to craft an engaging tweet using OpenAI's powerful language model.

### Step 3: Add OpenAI – Ask ChatGPT

Add an **OpenAI** action and choose `Ask ChatGPT`. Configure the following settings:

* **Model**: `chatgpt-4o-latest`
* **Prompt**:

  ```text
  Write a casual engaging tweet that is related to the below content. Keep it short under 280 characters.

  {{dappier_response}}
  ```
* **Max Tokens**: 2048
* **Temperature**: 0.9
* **Presence Penalty**: 0.6
* **Frequency Penalty**: 0

This prompt encourages the model to generate light, human-like content optimized for virality—fitting Twitter’s tone and length constraints.

***

## 🐦 Post Tweet to Twitter Automatically

Once OpenAI generates the tweet content, the final step is to post it directly to your Twitter account using the Twitter piece.

### Step 4: Add Twitter – Create Tweet

Add a **Twitter** action and choose `Create Tweet`. Configure the following:

* **Text**:

  ```text
  {{generated_tweet_text}}
  ```

  Use the dynamic output from the previous OpenAI step to populate this field.
* **Authentication**: Connect your Twitter account via OAuth when prompted.

Once set, every time this workflow runs, it will autonomously tweet engaging content based on the most recent AI news—no manual effort required.

## 🌟 Highlights

This notebook has guided you through building a fully automated social media workflow that tweets real-time AI news content using Activepieces, OpenAI, Dappier, and Twitter. You’ve seen how to schedule automation, retrieve trending data, generate engaging tweets, and post them—all without human intervention.

Key tools utilized in this notebook include:

* **Activepieces**: A powerful no-code automation platform that enables scheduled workflows, intelligent decision logic, and seamless integration with third-party apps like Twitter.
* **Dappier**: A platform connecting LLMs and automation tools to real-time, rights-cleared data from trusted sources, specializing in trending news, finance, and lifestyle. It delivers enriched, prompt-ready data, ideal for dynamic content generation.
* **OpenAI**: Used here for transforming trending content into engaging, human-like tweets using GPT-4o.
* **Twitter**: The final publishing channel where the generated tweets are posted automatically.

This setup is easily extendable—you can adapt it to tweet content from different domains (e.g., sports, finance, health) or trigger it based on events, RSS feeds, or even Gmail requests.

# 🪄 Activepieces Stock Market Analyst Bot: Automate Investment Research with Real-Time Data & AI-Powered Reports

Source: https://docs.dappier.com/cookbook/recipes/activepieces-stock-market-workflow

You can also import this template directly from here - [https://cloud.activepieces.com/templates/fb3b8HFenLKGNxe8BRJhA](https://cloud.activepieces.com/templates/fb3b8HFenLKGNxe8BRJhA)

This notebook demonstrates how to set up and leverage Activepieces’ automation platform combined with Dappier and OpenAI for stock market analysis. By combining real-time data, AI reasoning, and workflow automation, this notebook walks you through an innovative approach to creating fully automated investment research reports.
In this notebook, you'll explore: * **Activepieces**: A powerful no-code automation platform that enables multi-step workflows with triggers, conditional routing, and app integrations—without writing a single line of code. * **Dappier**: A platform connecting LLMs and automation tools to real-time, rights-cleared data from trusted sources, specializing in domains like stock market, finance, and news. It delivers enriched, prompt-ready data, empowering AI with verified and up-to-date information. * **OpenAI**: A leading provider of advanced AI models capable of natural language understanding, contextual reasoning, and content generation. It enables intelligent, human-like interactions and supports a wide range of applications across various domains. * **Gmail**: Email-based interaction point that triggers this automation and serves as the delivery channel for the final investment report. This setup not only demonstrates a practical application of AI-driven stock market analysis but also provides a flexible framework that can be adapted to other real-world scenarios requiring real-time data integration, automation logic, and AI-powered summarization. ## ⚙️ Starting with Setup in Activepieces To get started, head over to [Activepieces](https://cloud.activepieces.com) and create a **new flow**. This bot will trigger when a new email is received in your Gmail inbox, analyze the query using AI, fetch real-time financial data from Dappier, and send a full investment report via email. ### Step 1: Set the Trigger – New Email in Gmail Search for the **Gmail** piece and choose the **New Email** trigger. Configure the following: * **Authentication**: Connect your Gmail account. * **Label**: Set to `INBOX` to monitor all incoming messages. * **Category**: Leave it empty to include all categories. Once this trigger is set, your automation will activate whenever a new email lands in your inbox. ### Step 2: Determine Relevance – Is It a Stock Query? Add a new **OpenAI** action immediately after the trigger. Configure it as follows: * **Model**: `chatgpt-4o-latest` * **Prompt**: ```text Return only a boolean value: true or false. Determine whether the following query mentions a stock ticker symbol or company name related to the stock market. Query: {{trigger['message']['text']}} ``` * **Max Tokens**: 2048 * **Temperature**: 0.1 ## 🔀 Conditional Routing & Ticker Extraction ### Step 3: Add a Router to Branch Logic Use the **Router** to evaluate the result from the previous OpenAI step. Configure two branches: * **Branch Name**: `Stock Analysis` * **Condition**: ```text {{step_2}} exactly matches "true" ``` * **Branch Name**: `Otherwise` * This fallback branch will do nothing or exit the flow if the email isn't related to a stock query. ### Step 4: Extract the Stock Ticker Symbol Under the `Stock Analysis` branch, add a new **OpenAI** action. Configure it as follows: * **Model**: `chatgpt-4o-latest` * **Prompt**: ```text Extract the stock ticker symbol from the following query. Respond with only the ticker symbol—no explanations, no extra text. query: {{trigger['message']['text']}} ``` * **Max Tokens**: 2048 * **Temperature**: 0.1 This action ensures you only extract the relevant stock ticker (e.g., `AAPL` from “Tell me about Apple”). ## 📊 Real-Time Financial Data Retrieval using Dappier With the extracted stock ticker, we'll now gather structured financial insights using **Dappier's real-time models**. For each query below, make sure to use the **`Real Time Data`** action from the Dappier piece. 
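Before moving on to data retrieval, it can help to test the classification and ticker-extraction prompts from Steps 2–4 outside of Activepieces. The sketch below reproduces that logic with the OpenAI Python SDK, using the model name, prompts, and settings from the steps above; the `ask` helper and the sample email text are illustrative only, and it assumes `OPENAI_API_KEY` is set in your environment.

```python Python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(prompt: str) -> str:
    # Mirrors the "Ask ChatGPT" action settings used in Steps 2 and 4.
    response = client.chat.completions.create(
        model="chatgpt-4o-latest",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.1,
        max_tokens=2048,
    )
    return response.choices[0].message.content.strip()

email_text = "Tell me about Apple"  # illustrative stand-in for the Gmail trigger text

# Step 2: is this a stock-market query? (boolean classification)
is_stock_query = ask(
    "Return only a boolean value: true or false. Determine whether the following "
    f"query mentions a stock ticker symbol or company name related to the stock market. Query: {email_text}"
)

# Step 3/4: the router branch runs only when the answer exactly matches "true"
if is_stock_query.lower() == "true":
    ticker = ask(
        "Extract the stock ticker symbol from the following query. Respond with only "
        f"the ticker symbol—no explanations, no extra text. query: {email_text}"
    )
    print(ticker)  # e.g. AAPL
```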
### Step 5: Get Company Overview

Add a **Dappier** action and select `Real Time Data`. Use the following query:

```text
Latest company overview for {{extracted_ticker}} include company profile, industry, sector, CEO, headquarters location, employee count, market capitalization and stock ticker symbol.
```

This gives a detailed snapshot of the company and its basic metadata.

***

### Step 6: Get Financial Performance Metrics

Add another **Dappier** action (again using `Real Time Data`) and enter:

```text
Extract latest financial performance data for {{extracted_ticker}}, including Revenue (TTM), Net Income (TTM), Year-over-Year revenue growth, gross margin, and recent quarterly trends
```

This returns structured financial performance metrics and key indicators.

***

### Step 7: Competitive Benchmarking

Add a third **Dappier** action (also using `Real Time Data`) with the query:

```text
Identify 3-5 peer companies in the same sector as {{extracted_ticker}}. Extract key metrics such as P/E ratio, revenue, stock price, and market cap.
```

This enables side-by-side comparison with industry peers.

## 📈 Real-Time Stock Snapshot & Market News

We’ll now use Dappier’s specialized **Stock Market Data** action to retrieve detailed market-level insights and the latest categorized financial news.

### Step 8: Get Real-Time Stock Snapshot

Add a **Dappier** action and select `Stock Market Data`. Use the following query:

```text
Include latest current price with %daily change, volume, 52-week high/low, P/E ratio, EPS, dividend yield, and chart data for 1D, 5D, 1M, YTD, and 1Y for {{extracted_ticker}}
```

This returns a structured snapshot of the company's live trading metrics and historical movement.

***

### Step 9: Get Categorized Financial News

Add another **Dappier** action (also using `Stock Market Data`) with the following query:

```text
Fetch latest financial news for {{extracted_ticker}}. Include Earnings, Analyst Ratings, Market Moves, Partnerships, and Legal/Regulatory.
```

This returns categorized, real-time news summaries, perfect for building context in your final investment report.

## 🧠 Generate Investment Report with OpenAI

Now that we've collected all the required data—company profile, financial performance, peer benchmarking, real-time stock snapshot, and categorized news—we'll synthesize everything into a polished, readable investment report using OpenAI.

### Step 10: Compile the Full Report

Add an **OpenAI** action and configure it with the following:

* **Model**: `chatgpt-4o-latest`
* **Prompt**:

````text
Compile a comprehensive investment report for **{{extracted_ticker}}**, formatted as a single HTML email body. Synthesize the outputs of all tasks: company overview, financial performance, competitive benchmarking, real-time stock snapshot, and categorized financial news.

The report must include the following structured sections:

1. **Quick Summary**
   A concise AI-generated summary of {{extracted_ticker}} (e.g., "Apple is a global tech leader…").

2. **Company Profile Table**
   Industry, Sector, CEO, HQ Location, Employees, Market Cap.

3. **Financial Performance Table**
   Revenue (TTM), Net Income (TTM), YoY Growth, Gross Margin. Include a short narrative on key trends below the table.

4. **Competitive Benchmarking Table**
   Compare {{extracted_ticker}} against 3–5 peers using: P/E, Revenue, Stock Price, Market Cap.

5. **Real-Time Stock Snapshot**
   Include: Price, % Change, Volume, 52-week High/Low, P/E, EPS, Dividend Yield.
   Present chart summaries for 1D, 5D, 1M, YTD, and 1Y as plain text descriptions or inline image URLs (if available).

6. **Categorized Financial News**
   Include 5 categories:
   - Earnings
   - Analyst Ratings
   - Market Moves
   - Partnerships
   - Legal/Regulatory

   Tag each news item with sentiment: 🟢 Positive, 🟡 Neutral, 🔴 Negative.

7. **Insight Section**
   Write 2–3 paragraphs with:
   - What’s going on with {{company_name}}
   - Why it matters
   - Outlook (*clearly labeled as not financial advice*)

**Important formatting instructions**:
- Use semantic HTML (e.g., `<h2>`, `<table>`, `<p>`)
- Apply all styles via **inline CSS only**
- Ensure all links are clickable (`<a href>`)
- Keep it responsive and compatible with major email clients (no JavaScript, no external assets)
- Don't include ```html in the output.

All the related information is below:

{{company_overview_response}}
{{financial_performance_response}}
{{peer_comparison_response}}
{{stock_snapshot_response}}
{{news_response}}
````

This action will generate a clean, richly structured report in semantic HTML format—ready to be delivered by email.

## 📤 Delivering the Report via Gmail

Now that the investment report is generated, let’s send it back to the original sender as a formatted HTML email using Gmail.

### Step 11: Send the Email

Add a **Gmail** action and choose `Send Email`. Configure the email fields as follows:

* **To**:

  ```text
  {{email_sender_address}}
  ```

  Use the dynamic reference to send it back to the email originator:

  ```text
  {{trigger['message']['from']['value'][0]['address']}}
  ```
* **Subject**:

  ```text
  Stock Market Analysis of {{extracted_ticker}}
  ```
* **Body**:

  ```text
  {{investment_report_html}}
  ```
* **Body Type**: `HTML`
* **CC/BCC/Reply-To**: Leave these empty unless needed.
* **Draft**: Set to `false` to send the email immediately.

Once configured, the flow will respond to every stock-related query with a professional-grade investment report, fully automated.

## 🌟 Highlights

This notebook has guided you through building a fully automated stock market analysis workflow using Activepieces, OpenAI, Dappier, and Gmail. You’ve seen how to classify queries, extract ticker symbols, fetch real-time financial data, generate detailed reports, and deliver them directly to email—all without writing a single line of backend code.

Key tools utilized in this notebook include:

* **Activepieces**: A powerful no-code automation platform that enables app-triggered workflows with conditional logic, AI actions, and app integrations.
* **OpenAI**: A leading provider of advanced AI models used here to classify stock queries, extract ticker symbols, and generate investment reports in rich HTML format.
* **Dappier**: A platform connecting LLMs and automation tools to real-time, rights-cleared data from trusted sources, specializing in domains like stock market, finance, and news. It delivers enriched, prompt-ready data, empowering automations with verified and up-to-date information.
* **Gmail**: Serves as both the input trigger and output channel, making the workflow seamlessly email-driven.

This comprehensive setup allows you to adapt and expand the example for various scenarios requiring stock research, financial insights, or real-time data–driven automation.

# ⚙️ Build Smarter Agent.ai Agents with Dappier’s Real-Time, Verified Data Models

Source: https://docs.dappier.com/cookbook/recipes/agent-ai-create-dappier-agent

[**Agent.ai**](http://Agent.ai) is a professional network and marketplace for AI agents—and the people who love them. It allows users to discover, connect with, and hire a variety of AI agents to perform useful tasks, serving as a hub for agent-based collaborations and innovations.

[**Dappier**](https://dappier.com/developers/) is a platform that connects LLMs and Agentic AI agents to real-time, rights-cleared data from trusted sources, including web search, finance, and news. By providing enriched, prompt-ready data, Dappier empowers AI with verified and up-to-date information for a wide range of applications.
This guide provides a step-by-step process for extracting real-time search data from Dappier RAG models into a new Agent on the Agent.ai platform. Follow the instructions carefully to ensure a successful setup.

For the purpose of this guide, we are using Dappier’s Real Time Data RAG model, available here: [https://marketplace.dappier.com/marketplace/real-time-search](https://marketplace.dappier.com/marketplace/real-time-search)

## Watch the Video Guide

If you prefer a visual walkthrough, check out the accompanying video guide below:

## 📦 Installation

To get started, install the required tools and dependencies:

**Step 1: Install `uv` (required to run the Dappier MCP server)**

**macOS / Linux**:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

**Windows**:

```bash
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

***

**Step 2: Install Python Packages**

Install OpenAI Agents SDK.

```bash
pip install openai-agents
```

## 🔑 Setting Up API Keys

You’ll need API keys for both **Dappier** and **OpenAI** to authenticate your requests and access tools.

**Dappier API Key**

Head to [Dappier](https://platform.dappier.com/profile/api-keys) to sign up and generate your API key. Dappier offers free credits to get started.

You can set your API key as an environment variable in your terminal:

```bash
export DAPPIER_API_KEY=your-dappier-api-key
```

Or programmatically in your Python script:

```python Python
import os

os.environ["DAPPIER_API_KEY"] = "your-dappier-api-key"
```

***

**OpenAI API Key**

Visit [OpenAI](https://platform.openai.com/account/api-keys) to retrieve your API key. Set it in your terminal:

```bash
export OPENAI_API_KEY=your-openai-api-key
```

Or in your Python script:

```python Python
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
```

## ⚙️ Import Dependencies

Start by importing all required modules to build the travel planner agent. This includes components from the OpenAI Agents SDK and the Dappier MCP server.

```python Python
import asyncio
import shutil
import os
from datetime import datetime

from agents import Agent, Runner, trace
from agents.mcp import MCPServer, MCPServerStdio
from openai.types.responses import ResponseTextDeltaEvent
```

These imports enable:

* Running the MCP server locally via `MCPServerStdio`
* Tracing and managing the agent's execution with `Runner` and `trace`
* Streaming the assistant's output using `ResponseTextDeltaEvent`

## 📝 Define User Input

We’ll collect basic trip preferences from the user: city, number of days, and travel start date.

```python Python
def get_user_input():
    city = input("Enter the city for your travel: ").strip()
    num_days = input("Enter the number of days for the trip: ").strip()
    travel_date = input("Enter the start date of travel (YYYY-MM-DD): ").strip()
    return city, num_days, travel_date
```

## 🛰️ Run the Agent with Dappier MCP

This function sets up the agent, formulates the user query, and streams the response using tools served via Dappier MCP.

```python Python
async def run(mcp_server: MCPServer, city: str, num_days: str, travel_date: str):
    agent = Agent(
        name="TravelPlanner",
        instructions="""
        You are a dynamic travel planning assistant.
        Your goal is to build a customized travel itinerary.
        Use Dappier's tools to get any real time information.
        """,
        mcp_servers=[mcp_server],
        model="gpt-4o-mini"
    )

    query = f"""
    Generate a {num_days}-day travel itinerary for {city}, tailored to the real-time weather forecast for the selected date: {travel_date}.

    Follow these steps:

    Determine Current Date and Travel Period:
    Use Dappier's real-time search to identify the current date and calculate the trip duration based on user input.

    Fetch Weather Data:
    Retrieve the weather forecast for {city} during the selected dates to understand the conditions for each day.

    Fetch Live Events Data:
    Use Dappier's real-time search to find live events happening in {city} during the trip dates.

    Fetch Hotel Deals Data:
    Use Dappier's real-time search to find the best hotel deals with booking links in {city} during the trip dates.

    Design the Itinerary:
    Use the weather insights, live events, hotel deals to plan activities and destinations that suit the expected conditions. For each suggested location:

    Output:
    Present a detailed {num_days}-day itinerary, including timing, activities, booking links, weather information for each day and travel considerations. Ensure the plan is optimized for convenience and enjoyment.
    """

    print("\n" + "-" * 40)
    print(f"Running itinerary generation for {city} starting {travel_date} for {num_days} days")
    print("\n=== Streaming Start ===\n")

    result = Runner.run_streamed(agent, query)

    async for event in result.stream_events():
        if event.type == "raw_response_event" and isinstance(event.data, ResponseTextDeltaEvent):
            print(event.data.delta, end="", flush=True)

    print("\n\n=== Streaming Complete ===\n")
```

## 🚦 Initialize and Launch the Workflow

The `main()` function sets up the Dappier MCP server, enables tracing for observability, and runs the travel planning agent.

```python Python
async def main():
    city, num_days, travel_date = get_user_input()

    async with MCPServerStdio(
        cache_tools_list=True,
        params={
            "command": "uvx",
            "args": ["dappier-mcp"],
            "env": {"DAPPIER_API_KEY": os.environ["DAPPIER_API_KEY"]},
        },
    ) as server:
        with trace(workflow_name="Dynamic Travel Planner with Dappier MCP"):
            await run(server, city, num_days, travel_date)
```

## 🧪 Run the Travel Planner Agent

This block checks for the required `uvx` binary and launches the full async workflow. Make sure `uvx` is installed and available in your system path.

```python Python
if __name__ == "__main__":
    if not shutil.which("uvx"):
        raise RuntimeError(
            "uvx is not installed. Please install it with `pip install uvx` or "
            "`curl -LsSf https://astral.sh/uv/install.sh | sh`."
        )

    asyncio.run(main())
```

```json
Enter the city for your travel: Tokyo
Enter the number of days for the trip: 2
Enter the start date of travel: tomorrow

----------------------------------------
Running itinerary generation for Tokyo starting tomorrow for 2 days

=== Streaming Start ===

Here's a cozy and adventurous 2-day itinerary for your trip to Tokyo, tailored for April 10-11, 2025, with considerations for the weather, live events, and accommodation options. --- ### **Day 1: April 10, 2025 (Overcast & Showers)** **Morning:** - **Breakfast at a local café.** - Recommended: **Doutor Coffee Shop** for a hearty breakfast. - **Activity:** Visit **Ueno Park**. - A great place to enjoy flowering cherry blossoms and explore museums. - **Tip:** Bring an umbrella! Expect passing showers throughout the day. **Time: 9:00 AM – 12:00 PM** --- **Lunch:** - **Location:** **Ameyoko Market** - Explore various street foods. - Try some grilled fish skewers and taiyaki for dessert! **Time: 12:30 PM – 1:30 PM** --- **Afternoon:** - **Activity:** Visit the **National Museum of Nature and Science.** - Indoor activity, perfect for the rainy weather. - Engage with interactive exhibits about Japan’s flora, fauna, and technology.
**Time: 2:00 PM – 4:30 PM** --- **Evening:** - **Dinner:** Enjoy dinner at **Ippudo Ramen** in Akihabara. - Experience delicious tonkotsu ramen, perfect for a rainy evening. **Time: 5:00 PM – 7:00 PM** --- **Live Event:** - **Explore local venues for music or comedy shows.** - While specific events aren’t listed for April 10, many venues often have surprise performances! --- **Accommodation:** - **Hotel Suggestion:** **HOTEL GROOVE SHINJUKU, A PARKROYAL Hotel** - Price: Approx. **$109/night** - Location: Central, near Shinjuku. Free cancellation available. --- ### **Day 2: April 11, 2025 (Pleasant & Partly Cloudy)** **Morning:** - **Breakfast:** Local breakfast at your hotel or **Sarabeth's** for a delightful brunch. - **Activity:** Visit **Meiji Shrine.** - A peaceful shrine surrounded by a beautiful forest—enjoy leisurely walks. **Time: 8:00 AM – 11:00 AM** --- **Lunch:** - **Location:** **Omotesando Hills** - Explore shops and cafés; grab a bite at the **Café Kitsuné.** **Time: 11:30 AM – 1:00 PM** --- **Afternoon:** - **Activity:** Shopping in **Harajuku** & explore **Takeshita Street.** - Discover quirky shops and stylish street fashion. **Time: 1:30 PM – 3:30 PM** --- **Evening:** - **Live Event:** Attend the **Music Bridge Tokyo 2025** - Check out a variety of performances happening in the evening! **Dinner:** - **Location:** Restaurant nearby the event venue. - Enjoy Tokyo-style sushi or izakaya dining. **Time: 5:00 PM – 8:00 PM** --- **Accommodation:** - **Option:** If staying longer, consider **Park Hotel Tokyo** or **Hotel Century Southern Tower** for convenience and comfort. ### **Weather Overview:** - **April 10:** 20°C (68°F), overcast with showers. 🌧️ - **April 11:** 20°C (68°F), pleasant and partly cloudy. 🌤️ ### **Travel Considerations:** - **Transport:** Use the Tokyo Metro or JR trains for easy commuting. - **Pack an umbrella** for Day 1 and wear comfortable shoes for walking. --- Enjoy your delightful Tokyo trip filled with unique experiences! 🗼✨ === Streaming Complete === ``` ## 🌟 Highlights This cookbook has guided you through setting up and running a dynamic travel planner using **OpenAI Agents** and the **Dappier MCP Server**. By connecting your agent to real-time tools via MCP, you’ve created an assistant capable of generating rich, up-to-date itineraries that adapt to weather, events, and deals. Key components of this workflow include: * **OpenAI Agents SDK**: A powerful toolkit that enables large language models to operate as autonomous agents, use tools, and execute multi-step workflows with memory and structured decision-making. * **Dappier MCP**: A Model Context Protocol server that connects your agents to real-time, rights-cleared, AI-powered tools such as live search, weather, stock data, and content recommendations. * **Dynamic Travel Planning**: A real-world use case where the assistant creates a multi-day itinerary using live weather, events, and hotel data sourced via MCP. This architecture can be adapted to other use cases requiring live data integration, intelligent tool use, and context-aware output generation using the Agents SDK and MCP ecosystem. 
# 🕵🏻 Building an AI-Powered Stock Analyser using OpenAI Agents SDK and Dappier

Source: https://docs.dappier.com/cookbook/recipes/open-ai-agent-stock-analyser

You can also check out this cookbook in Colab [here](https://colab.research.google.com/drive/1o2rzrxSaqmGuZu4BEBxRIbTX5tOtyetP?usp=sharing).

## Introduction

This notebook provides a step-by-step guide to building an AI-powered Stock Market Analyzer using OpenAI's Agents SDK and Dappier. The analyzer generates real-time stock market insights based on user input, fetching financial news, market trends, breaking news alerts, and high-performing similar stocks dynamically. It then formulates a data-driven trading strategy, including entry/exit points, risk management, and investment recommendations.

By leveraging Dappier’s real-time search and AI-driven insights, this stock market agent helps users make informed and strategic investment decisions with up-to-date market intelligence. 🚀

## Watch the Video Guide

If you prefer a visual walkthrough, check out the accompanying video guide below: