Step 1: Automated Tour Sales Pipeline
1️⃣ Perfect output – scan ALL
2️⃣ Add output numbers, then…
3️⃣ Add subagent numbers (work backwards from the output number!)
4️⃣ Add ACTUAL skills to subagent
✅ DONE – copy ×4 to Step 3
What Shall We Build Next?
1. Describe – describe your task
2. Refine – refine the plan
3. SubAgents – review all agents
4. Deploy – deploy your agent
Sub Agent 1
Sub Agent 2
Sub Agent 3
Sub Agent 4
Sub Agent 5
Sub Agent 6
Sub Agent 7
Sub Agent 8
A) SUBAGENT SUMMARY
“Market_Scout_Agent” harvests three live data sources (Skyscanner inbound demand, TripAdvisor forum chatter, Google Travel Trends) for five target countries, normalises the signals into one weighted metric, and exports a single JSON file (`luxury_segments.json`) every night at 02:00 AWST.

B) FINAL TASK OUTPUT
/data/luxury_segments.json – UTF-8, pretty-printed JSON array. Schema:
[
  {
    "country": "Japan",
    "topic": "Swan Valley Wine Tour",
    "interest_score": 0.83   // 0–1 float, 3-source weighted
  },
  … (≤150 objects)
]

C) SUBAGENT INPUT
{tripadvisor_headers} – custom request headers / cookies for TripAdvisor
{skyscanner_api_key} – bearer or key param usable in the Skyscanner Demand API
{google_trends_api_key} – key or cookie jar that allows Google Travel Trends calls

D) SUBAGENT TASK SUMMARY
{tripadvisor_headers}, {skyscanner_api_key}, {google_trends_api_key}
→ #223 Powerful LLM – assemble static config (country list & WA topic keywords)
→ #224 Oracle Ask – call the Skyscanner “Browse-Stats” endpoint per country (arrivals = PER) → raw_demand_{country}.json
→ #226 Extract Structured Data From URL – scrape the top 3 TripAdvisor forum pages per topic (using {tripadvisor_headers}) → trip_forum_{country}_{topic}.txt
→ #224 Oracle Ask – query the Google Travel Trends 30-day index for “Western Australia” filtered to each country → trends_{country}_{topic}.json
→ #223 Powerful LLM – normalise & weight (40% Skyscanner, 35% TripAdvisor post count, 25% Google Trends index), produce interest_score, build the master array
→ #223 Powerful LLM – validate the JSON schema & pretty-print → (final output) /data/luxury_segments.json

E) SKILL-BY-SKILL FLOW
1. Build config list – Skill #223
   Prompt: “Return an array of the five target ISO-2 country codes (JP, CN, SG, DE, AU) and the eight WA tourism topics…”
   Output: config.txt (used internally)
2. Skyscanner demand fetch (looped, one per country) – Skill #224 (can make authenticated HTTPS calls)
   Input example: “GET https://partners.api.skyscanner.net/apiservices/v3/stats/flights/browse?originCountry=&destinationAirport=PER&apikey={skyscanner_api_key} … Please return JSON only.”
   Output: raw_demand_JP.json (five such files)
3. TripAdvisor forum scrape (loop country→topic) – Skill #226
   Input URL: `https://www.tripadvisor.com.au/ShowForum-g255103-i531-Perth_Greater_Perth_Western_Australia.html` (or a topic-specific URL)
   Instruction: “Using {tripadvisor_headers}, extract number_of_posts_last_30d and language of post.”
   Output: trip_forum_JP_Perth.txt (~40 files)
4. Google Travel Trends pull (loop country→topic) – Skill #224
   Input: “Query the Google Travel Trends API with key {google_trends_api_key} for keyword ‘Perth’ over the last 30 days, geo=. Return JSON with daily index.”
   Output: trends_JP_Perth.json
5. Merge & score – Skill #223
   Prompt piped with all partial outputs: “Normalize each data source (min-max); weight 0.4/0.35/0.25; produce float 0–1. Return array of objects {country, topic, interest_score}.”
   Output: weighted_array.txt
6. Validate / export – Skill #223
   Prompt: “Take weighted_array.txt and output valid, pretty-printed JSON. Save filename /data/luxury_segments.json.”
   Output: /data/luxury_segments.json

F) SILOS
SILO 1 – Skyscanner Data: loop five countries → #224 Oracle Ask → raw_demand_*.json
SILO 2 – TripAdvisor Signals: loop five×eight pages → #226 Extract Structured Data → trip_forum_*.txt
SILO 3 – Google Trends: loop five×eight queries → #224 Oracle Ask → trends_*.json
SILO 4 – Aggregation & Export: combine all partials → #223 (twice) → final JSON

The above chain fulfils the Market_Scout_Agent spec, accepts the three required credentials, and reliably writes `/data/luxury_segments.json` every day at 02:00 AWST.
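The min-max normalisation and 0.4/0.35/0.25 weighting in step 5 can be sketched in plain Python. This is only an illustration of the formula, not platform code; the field names `skyscanner`, `tripadvisor` and `trends` are hypothetical stand-ins for the three partial outputs:

```python
def min_max(values):
    """Min-max normalise a list of raw numbers onto the 0-1 range."""
    lo, hi = min(values), max(values)
    if hi == lo:  # avoid division by zero on flat data
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def interest_scores(rows):
    """rows: [{'country', 'topic', 'skyscanner', 'tripadvisor', 'trends'}, ...]
    Returns objects shaped like the final schema, with a weighted 0-1 score."""
    sky = min_max([r["skyscanner"] for r in rows])
    trip = min_max([r["tripadvisor"] for r in rows])
    trd = min_max([r["trends"] for r in rows])
    out = []
    for i, r in enumerate(rows):
        # 40% Skyscanner demand, 35% TripAdvisor post count, 25% Trends index
        score = 0.40 * sky[i] + 0.35 * trip[i] + 0.25 * trd[i]
        out.append({"country": r["country"], "topic": r["topic"],
                    "interest_score": round(score, 2)})
    return out
```

Because every source is normalised before weighting, a country that dominates one feed cannot swamp the composite score.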
SubAgent #1 - Diagram
A) SUBAGENT SUMMARY
“Lead_Gen_Agent” transforms the raw country-by-interest demand signals in luxury_segments.json into a de-duplicated, enriched and numerically scored list of the 25 hottest individual travellers (or small-group planners) likely to book a Deluxe Chauffeured Cars tour.

B) FINAL TASK OUTPUT
/data/qualified_leads.csv – UTF-8 CSV (header + ≤25 rows) with the columns:
lead_id, first_name, last_name, email, language, country, source_platform, matched_topic, last_post_date, clearbit_company, hunter_confidence, travel_window, composite_lead_score (0–100)

C) SUBAGENT INPUT
• {clearbit_api_key} – secret string
• {hunter_api_key} – secret string
• luxury_segments.json – path passed from Market_Scout_Agent (structured as [{country, topic, interest_score}, …])

E) SUBAGENT TASK SUMMARY (skill chain)
1. Parse demand segments → build search plan
   luxury_segments.json → #223 (Powerful LLM)
   Prompt: “Read JSON and output a table of max-3 high-interest {country, topic, keyword_variations}. Include matching hashtag ideas for Instagram.”
   Output: search_plan.txt
2. Harvest TripAdvisor threads for each {country+topic}
   a. #224 (Oracle Ask) – “Return top 3 TripAdvisor forum URLs from the past 6 months where travellers from discuss in Western Australia.” (loop ≤9 URLs)
   b. For every URL → #226 (Extract Structured Data)
      Instructions: “Extract reviewer_username, reviewer_home_country, review_date, review_text.”
      Output accumulates into raw_tripadvisor_leads.json
3. Harvest Instagram handles engaging with WA tourism content
   a. For each hashtag in search_plan.txt → #224 (Oracle Ask) – “List 10 most recent Instagram posts using # where user location/bio shows .”
   b. #226 on every post URL – “Extract instagram_handle, post_date, bio_text, location.”
      Output accumulates into raw_instagram_leads.json
4. Combine & deduplicate raw leads
   raw_tripadvisor_leads.json + raw_instagram_leads.json → #223
   Prompt: “Merge, normalise field names, drop duplicates (same email/handle), keep platform flag.”
   Output: consolidated_leads.json
5. Enrich with Clearbit & Hunter (email, name, company)
   consolidated_leads.json → #223
   Prompt contains step-by-step HTTP requests (using {clearbit_api_key} & {hunter_api_key}) to:
   • Clearbit Person API (by handle or domain)
   • Hunter Email Finder + Email Verifier
   The LLM executes these and returns enriched_leads.json (adds email, first_name, last_name, company, hunter_confidence)
6. Score the leads
   enriched_leads.json → #223
   Prompt: “Create composite_lead_score (0–100) weighted: 40% interest_score (from segment), 30% recency (post ≤30 days), 20% hunter_confidence, 10% matched_topic relevance. Also derive language code (en, ja, zh, de) and travel_window if mentioned.”
   Output: scored_leads.json
7. Select best 25 & write CSV
   scored_leads.json → #185 (Write Text)
   Prompt: “Sort by composite_lead_score desc, keep top 25, output as CSV with headers exactly: lead_id, first_name, last_name, email, language, country, source_platform, matched_topic, last_post_date, clearbit_company, hunter_confidence, travel_window, composite_lead_score.”
   Output saved automatically as /data/qualified_leads.csv

F) SILOS
SILO 1: Search-Plan Builder – luxury_segments.json → #223 → search_plan.txt
SILO 2: TripAdvisor Miner – search_plan.txt → #224 → #226 → raw_tripadvisor_leads.json
SILO 3: Instagram Miner – search_plan.txt → #224 → #226 → raw_instagram_leads.json
SILO 4: Enrichment & Scoring – consolidated_leads.json → #223 (Clearbit/Hunter calls) → #223 (scoring) → scored_leads.json
SILO 5: Output Formatter – scored_leads.json → #185 → /data/qualified_leads.csv

This detailed flow keeps every skill’s input/output contract intact while ensuring the subagent reliably delivers a high-quality, immediately usable qualified_leads.csv for the downstream Hyper_Personalization_Agent.
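The 40/30/20/10 composite in step 6 can be made concrete with a short Python sketch. The field names and the binary recency cut-off are assumptions for illustration; `interest_score` and `topic_relevance` are taken as 0–1 floats and `hunter_confidence` as a 0–100 integer:

```python
from datetime import date

def composite_lead_score(lead, today):
    """Score a lead 0-100 with the step-6 weighting:
    40% segment interest, 30% recency (post within 30 days),
    20% Hunter confidence, 10% matched-topic relevance."""
    days_old = (today - lead["last_post_date"]).days
    recency = 1.0 if days_old <= 30 else 0.0  # binary cut-off (assumption)
    score = (40 * lead["interest_score"]
             + 30 * recency
             + 20 * lead["hunter_confidence"] / 100
             + 10 * lead["topic_relevance"])
    return round(score)

def top_n(leads, today, n=25):
    """Step 7: sort by score descending and keep the best n."""
    ranked = sorted(leads, key=lambda l: composite_lead_score(l, today), reverse=True)
    return ranked[:n]
```

A lead with a recent post and a verified email therefore out-ranks a higher-interest lead whose signal is stale, which matches the intent of weighting recency at 30%.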
SubAgent #2 - Diagram
A) SUBAGENT SUMMARY
Hyper_Personalization_Agent receives every new batch of qualified leads, looks up live Rezdy inventory + local weather for the dates those leads are likely to travel, auto-writes a perfectly localized HTML offer email (with a 15% “book-within-24 h” promo) in the traveller’s language, sends it through the SMTP details supplied, and stores/updates a tracking row for each lead inside /data/email_log.csv (status = sent | opened | clicked).

B) FINAL TASK OUTPUT
/data/email_log.csv – UTF-8 CSV, pipe-delimited (“|”), one row per lead. Columns:
lead_id | email | country | language | tour_code | send_datetime | open_datetime | click_datetime | promo_expires | rezdy_booking_link | email_subject | status

C) SUBAGENT INPUT
{rezdy_api_key} – API key with read/write permissions
{weather_api_key} – key for a 7-day forecast endpoint (e.g. Open-Meteo)
{smtp_credentials} – host, port, user, pw, from-name
[qualified-leads-csv] – produced by SubAgent 2, columns: lead_id, email, country, language, tour_interest, est_travel_date, lead_score

E) SUBAGENT TASK SUMMARY
1. INITIALISE & READ LEADS
   [qualified-leads.csv] → #223 “Powerful LLM” to parse & iterate the list.
2. FOR EACH LEAD – SILO 1 • TOUR MATCH & AVAILABILITY
   a. #223 – Decide the best Rezdy productCode (map tour_interest → productCode).
   b. #226 “Extract Structured Data From URL” – URL: https://api.rezdy.com/v1/products/availability?productCode={productCode}&apiKey={rezdy_api_key}&startTime={est_travel_date}&endTime={est_travel_date} → returns JSON of sessions → keep the first slot with seats > 0.
   c. #223 – Build the final booking URL: https://deluxetours.rezdy.com/{productCode}?date={YYYY-MM-DD}&promo=15OFF
3. SILO 2 • WEATHER LOOK-UP
   #224 “Oracle Ask A Question” – Q: “Give a 7-day weather forecast (summary, max/min °C, chance of rain) for Perth, Western Australia starting {est_travel_date} in compact JSON.” → returns weather JSON.
4. SILO 3 • EMAIL GENERATION
   a. #190 “Write or rewrite text” – prompt: “Write an engaging HTML email (550-750 words) in {language}. Include: greeting using first name if present; 1-sentence hook about WA; tour name + 3-bullet description; weather snippet; 15% discount line (promo code ‘15OFF’); big ‘Book Now’ button pointing to {bookingURL}; footer with unsubscribe and licence TT12345-WA.”
   b. #223 – Wrap the result in JSON {subject, html}.
5. SILO 4 • SEND & LOG
   a. INTERNAL-SMTP (built-in node, no skill number) – Inputs: {smtp_credentials}, to=email, subject, html. Returns send_status (OK / fail) + message-id.
   b. #223 – Compile the /data/email_log.csv row:
      lead_id | email | country | language | productCode | send_datetime | “” | “” | (send_datetime+24h) | bookingURL | subject | status=sent
      Append it (or update the row if it already exists).
6. WEBHOOK HANDLERS (passive listeners inside the agent runtime, not explicit skills)
   • On the open webhook: update open_datetime, status=open.
   • On the click webhook: update click_datetime, status=clicked.
   • These updates are again written by #223 into /data/email_log.csv.

F) SILOS
SILO 1 – Tour Match & Availability: input one lead row; steps #223 (map interest→productCode) → #226 (availability) → #223 (bookingLink)
SILO 2 – Weather Fetch: input est_travel_date; step #224 (compact JSON forecast)
SILO 3 – Email Copy & HTML: input bookingLink, weather JSON, language; steps #190 (generate email HTML) → #223 (package JSON)
SILO 4 – Dispatch & Tracking: input email JSON + smtp creds; steps INTERNAL-SMTP send → #223 (write/update email_log.csv)

All silos execute concurrently for every lead, and the parent SubAgent instance processes all leads in the incoming batch in parallel.
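Steps 2c and 5b fix a concrete booking-URL format and a 12-column pipe-delimited log row, which can be sketched directly. This is a minimal illustration, not platform code; the lead field names are assumptions:

```python
from datetime import datetime, timedelta

def booking_url(product_code, tour_date):
    """Step 2c: Rezdy booking link with the 15OFF promo baked in."""
    return (f"https://deluxetours.rezdy.com/{product_code}"
            f"?date={tour_date:%Y-%m-%d}&promo=15OFF")

def email_log_row(lead, product_code, subject, send_dt):
    """Step 5b: one pipe-delimited row for /data/email_log.csv.
    open/click timestamps start empty; the promo expires 24 h after send."""
    promo_expires = send_dt + timedelta(hours=24)
    fields = [lead["lead_id"], lead["email"], lead["country"], lead["language"],
              product_code, send_dt.isoformat(), "", "",
              promo_expires.isoformat(),
              booking_url(product_code, lead["est_travel_date"]),
              subject, "sent"]
    return "|".join(str(f) for f in fields)
```

Keeping open_datetime and click_datetime as empty fields from the start means the webhook handlers in step 6 only ever update a row, never change its shape.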
SubAgent #3 - Diagram
A) SUBAGENT SUMMARY
Booking_Executor_Agent turns every “clicked” lead into a real-time Rezdy reservation (or a logged rejection) by re-validating inventory, enforcing pricing / guest / country rules, creating the booking through the Rezdy API, and writing the definitive bookings_log.csv line that contains the unique rezdy_booking_id.

B) FINAL TASK OUTPUT
/data/bookings_log.csv – UTF-8 CSV, pipe-delimited, rolling append, columns:
timestamp|lead_email|lead_country|tour_code|tour_date|pax|price_aud|rezdy_booking_id|status(success|rejected)|reason_if_rejected

C) SUBAGENT INPUT
{rezdy_api_key} – private API key with “read+write” scope
[qualified-leads.csv] – latest enriched lead table coming from SubAgent 2
[email-log.csv] – real-time log that holds “clicked” events emitted by SubAgents 3 & 5

D) SUBAGENT TASK SUMMARY (skill chain)
1. Detect click events
   email-log.csv ➜ #223 Powerful LLM (prompt: “Read the csv text and return a JSON array of rows where status==clicked and the booking_status field is empty”)
   ⏩ Output-1: JSON array clicked_leads
2. For each clicked lead (loop handled by the orchestration engine)
   2.1 Map lead → tour parameters
       Inputs: clicked_leads + qualified-leads.csv
       #223 Powerful LLM (prompt: “Match this lead email to the qualified-leads row; return JSON {tour_code, tour_date, pax, est_price, country}”)
       ⏩ Output-2: lead_booking_request
   2.2 Call Rezdy GET /availability
       #223 Powerful LLM (prompt builds the URL: `https://api.rezdy.com/v1/products/{tour_code}/availability?startTime={tour_date}&apiKey={rezdy_api_key}`)
       ➜ #226 Extract Structured Data From 1x URL (input: the URL above; extract: availability, price)
       ⏩ Output-3: rezdy_availability_json
   2.3 Validate business rules
       #223 Powerful LLM (prompt: “Using rezdy_availability_json and lead_booking_request, confirm: pax≤8, price≤5000, country∉[IR,KP,SY] and the slot is open; respond PASS/FAIL with reason.”)
       ⏩ Output-4: validation_result
   2.4 IF validation_result == PASS → create the booking
       #223 Powerful LLM (prompt builds the POST body: `{ "customer":{…}, "items":[{ "productCode":"{tour_code}", "quantity":{pax}, "startTime":"{tour_date}" }], "source":"AI-Agent" }`)
       ➜ #223 Powerful LLM (the same step issues an HTTPS POST to `https://api.rezdy.com/v1/bookings/create?apiKey={rezdy_api_key}` with retry policy 5×20 s + jitter, capturing the JSON)
       ➜ #226 Extract Structured Data From 1x URL (parses the POST response text, pulls rezdy_booking_id, final_price)
       status = success
       ELSE (FAIL): rezdy_booking_id = “” and status = rejected
   2.5 Compose the CSV row
       #223 Powerful LLM (prompt: “Return a single pipe-delimited CSV row with the columns listed in section B using the variables we now hold.”)
       ⏩ Output-5: csv_row
   2.6 Append to /data/bookings_log.csv
       #223 Powerful LLM (prompt: “Append csv_row to the end of /data/bookings_log.csv; create the file if it does not exist.”)
3. Return the updated /data/bookings_log.csv as the subagent’s outward artefact.

E) SILOS
SILO A – Intake & Click Detection: email-log.csv ➜ step 1 (extract clicked leads JSON)
SILO B – Availability & Rule Enforcement: lead_booking_request ➜ rezdy_availability_json ➜ validation_result
SILO C – Booking & Logging: on PASS ➜ create booking ➜ extract rezdy_booking_id; on FAIL ➜ mark rejection ➜ compose csv_row ➜ append to bookings_log.csv ➜ final output

This detailed flow keeps Booking_Executor_Agent laser-focused on turning intent into rezdy_booking_ids while honouring all business constraints and handing off an always-up-to-date bookings_log.csv to downstream agents.
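The PASS/FAIL rules from step 2.3 and the 5×20 s + jitter retry policy from step 2.4 are simple enough to sketch in Python. This is an illustration of the logic only; `send` stands in for whatever callable performs the Rezdy POST, and the field names are assumptions:

```python
import random
import time

BLOCKED_COUNTRIES = {"IR", "KP", "SY"}

def validate_booking(request, availability):
    """Step 2.3 business rules: pax<=8, price<=5000 AUD,
    country not embargoed, and an open slot. Returns (passed, reason)."""
    if request["pax"] > 8:
        return False, "pax exceeds 8"
    if request["est_price"] > 5000:
        return False, "price exceeds 5000 AUD"
    if request["country"] in BLOCKED_COUNTRIES:
        return False, "country not serviceable"
    if availability.get("seats", 0) < request["pax"]:
        return False, "no open slot"
    return True, ""

def post_with_retry(send, attempts=5, base_delay=20):
    """Step 2.4 retry policy: up to 5 tries, ~20 s apart plus jitter.
    `send` is a zero-arg callable that raises on failure (hypothetical)."""
    for attempt in range(attempts):
        try:
            return send()
        except Exception:
            if attempt == attempts - 1:
                raise  # exhausted: surface the failure to the caller
            time.sleep(base_delay + random.uniform(0, base_delay * 0.25))
```

Checking the rules before issuing the POST means a rejected lead never consumes a Rezdy API call, and the jitter keeps parallel leads from retrying in lockstep.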
4 Template & Links
A) SUBAGENT SUMMARY
“Urgency_Engine_Agent” scans the engagement & payment data every 30 minutes and automatically sends time-sensitive reminder / cart-recovery emails in the lead’s language, then writes each action back to email_log.csv.

B) FINAL TASK OUTPUT
/data/email_log.csv – updated in place (UTF-8 CSV, same column order as the original) with one new line per reminder sent:
lead_id,email,type,timestamp_AWST,language,subject,rezdy_booking_id(if any),status(sent|fail|bounced),opens,clicks,conversion

C) SUBAGENT INPUT
{smtp_credentials} – JSON block {host,port,user,pw,from_name,from_email}
[email-log-csv] – absolute file/URL of the current email log
[bookings-log-csv] – absolute file/URL of the current bookings log

E) SUBAGENT TASK SUMMARY
1. Fetch & parse the two CSV files
   a. [email-log-csv] → #226 Extract Structured Data From 1x URL → emailLog.json
   b. [bookings-log-csv] → #226 Extract Structured Data From 1x URL → bookingLog.json
2. Detect who needs an urgency touch-point
   #223 Powerful LLM Prompt-to-Text Response
   Input: “Given emailLog.json & bookingLog.json, return three arrays:
   A) openNoClick24 → leads who opened ≥24 h ago but have no click and no prior R1
   B) openNoClick48 → leads who opened ≥48 h ago but have no click and no prior R2
   C) paymentAbandoned → leads with bookingLog.status=‘payment_started’, started >15 min ago and not confirmed.”
   Output: JSON {openNoClick24:[…], openNoClick48:[…], paymentAbandoned:[…]}
3. Create personalised copy (multilingual)
   For each list element → #190 Write or rewrite text based on instructions
   Prompt includes: lead.language, lead.tour_name, rezdy_booking_link, the 15% promo, the appropriate template (R1, R2, or Complete-Your-Booking), and a friendly WA tone.
   Output: {subject, html_body}
4. Send the emails
   Internal “SendEmail()” routine – not an external skill – using {smtp_credentials}. Capture the SMTP response code.
5. Log the event
   → #185 Write Text (Or Copy) From Inputted Text
   Input: “Convert the following JSON of just-sent emails into CSV rows matching the master column order…”
   Output: csvRows.txt
6. Merge the rows into the master log
   • #223 Prompt-to-Text: “Append csvRows.txt to the tail of the [email-log-csv] content, preserving the header.”
   • Save the result as /data/email_log.csv (overwriting the original)

F) SILOS
SILO 1 – DATA INGEST & ANALYSIS
[email-log-csv] → #226 (parse to JSON) → emailLog.json
[bookings-log-csv] → #226 (parse to JSON) → bookingLog.json
emailLog.json + bookingLog.json → #223 (rule engine) → targets.json
SILO 2 – CONTENT GENERATION
targets.json (loop each lead) → #190 (draft subject/body in the proper language) → draftEmails.json
SILO 3 – DISPATCH & LOGGING
draftEmails.json + {smtp_credentials} → internal SendEmail() → sendResult.json → #185 (format new CSV rows) → csvRows.txt → #223 (merge rows into master) → /data/email_log.csv
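The three bucketing rules in step 2 can be expressed as a small Python rule engine. This is an illustrative sketch, not platform code; row field names such as `open_datetime`, `r1_sent` and `started_at` are assumptions, with timestamps taken as datetime objects:

```python
from datetime import datetime, timedelta

def urgency_targets(email_log, booking_log, now):
    """Bucket leads per the step-2 rules:
    openNoClick24/48 for opened-but-unclicked emails with no prior
    reminder, paymentAbandoned for stalled Rezdy checkouts."""
    open_no_click_24, open_no_click_48, payment_abandoned = [], [], []
    for row in email_log:
        opened = row.get("open_datetime")
        if not opened or row.get("click_datetime"):
            continue  # never opened, or already clicked: no reminder needed
        age = now - opened
        if age >= timedelta(hours=48) and not row.get("r2_sent"):
            open_no_click_48.append(row["lead_id"])
        elif age >= timedelta(hours=24) and not row.get("r1_sent"):
            open_no_click_24.append(row["lead_id"])
    for row in booking_log:
        if (row.get("status") == "payment_started"
                and now - row["started_at"] > timedelta(minutes=15)
                and not row.get("confirmed")):
            payment_abandoned.append(row["lead_id"])
    return {"openNoClick24": open_no_click_24,
            "openNoClick48": open_no_click_48,
            "paymentAbandoned": payment_abandoned}
```

Checking the 48-hour rule before the 24-hour rule ensures a lead is escalated to R2 rather than receiving a second R1.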
5 Template & Links
A) SUBAGENT SUMMARY
“Experience_Personalizer_Agent” listens to every confirmed Rezdy booking, then automatically delivers the guest-journey emails (confirmation → T-24 reminder → post-tour thank-you) in the traveller’s language, attaches a printable PDF guide, personalises each touch-point with live weather and pickup details, and records every send/open/review event in a single logfile.

B) FINAL TASK OUTPUT
/data/post_experience_log.csv – UTF-8 CSV with one row per outbound message. Columns:
rezdy_booking_id, guest_email, email_type, language, send_timestamp, delivery_status, open_ts, click_ts, review_submitted (yes/no), file_link_to_PDF (confirmation only)

C) SUBAGENT INPUT
• {smtp_credentials} – SMTP/Sendgrid keys for sending mail
• {weather_api_key} – for live weather lookup (added here)
• [bookings_log.csv] – produced by SubAgent 4 and appended to in real time
• Rezdy webhook payloads (booking.confirmed)

E) SUBAGENT TASK SUMMARY

====== EVENT-DRIVEN SILO 1: “Book-Confirm” ======
webhook(json) → #185 Write Text → #185 Write Text → system mailer → append row to post_experience_log.csv
1. Trigger: the Rezdy booking.confirmed webhook fires.
2. Parse the webhook to extract rezdy_booking_id, tour_name, start_datetime, pickup, guest_name, email, language_code.
3. #185 – Generate the EMAIL BODY
   Input: “Create a polite confirmation email in {language_code}. Must contain tour_name, start_datetime (local), pickup instructions, licence TT12345-WA, contact phone.”
   Output: HTML/plain-text email.
4. #185 – Generate the PDF CONTENT (tour guide, map link, checklist)
   Input: “Write a 2-page printable guide in {language_code} covering itinerary, what to bring, emergency contact, local tips.”
   Output: HTML string (the system converts it to PDF and returns a file URL).
5. The system emailer (native, not a numbered skill) sends the message with the PDF attachment to guest_email. Track message_id.
6. Append a log row with email_type = “confirmation”.

====== TIMER SILO 2: “T-24 Reminder” ======
cron job (runs hourly) → check bookings starting in 24 ± 0.5 h → #223 → #185 → mailer → log
1. Read [bookings_log.csv]; select tours whose start_datetime is 24 h in the future & not yet reminded.
2. #223 – Get the weather: “Using {weather_api_key}, fetch the 24-h forecast for the latitude/longitude in the booking payload.”
3. #185 – Create the reminder email (language_code). Include a weather icon/summary & checklist headline bullets.
4. Send the email; log a row with email_type = “t24_reminder”.

====== TIMER SILO 3: “Post-Tour Follow-up” ======
cron job (daily 03:00 AWST) → select tours that ended >3 h ago & are not yet thanked → #185 → mailer → log
1. Query bookings whose end_datetime passed ≥3 h ago and thank_you_sent = false.
2. #185 – Draft the thank-you email in language_code; include the Google review link https://go.engagesmartservices.com/deluxe-chauffeured-cars and promo code “RETURN15”.
3. Send the email.
4. Log a row with email_type = “post_tour_thanks”.

Open & review tracking (pixels / link UID) writes back the open_ts, click_ts and review_submitted columns asynchronously.

F) SILOS (clear view)
SILO 1 – Confirmation (immediate): input webhook → steps 3-6 above → output row
SILO 2 – T-24 Reminder (scheduled): input time scan + [bookings_log.csv] → steps 1-4 above → output row
SILO 3 – Post-Tour Thank-You (scheduled): input time scan + [bookings_log.csv] → steps 1-4 above → output row

All three silos write to the common /data/post_experience_log.csv, which becomes the authoritative history of guest-journey communications and review collection.
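The hourly selection in TIMER SILO 2 (tours starting in 24 ± 0.5 h that have not been reminded) reduces to a window filter, sketched here in Python. Field names like `start_datetime` and `reminded` are illustrative assumptions about the bookings_log rows:

```python
from datetime import datetime, timedelta

def due_for_t24_reminder(bookings, now):
    """Hourly cron check: keep bookings whose tour starts 24 h
    (plus or minus 30 min) from now and that are not yet reminded.
    The half-hour tolerance covers drift between hourly cron runs."""
    lo = now + timedelta(hours=23, minutes=30)
    hi = now + timedelta(hours=24, minutes=30)
    return [b for b in bookings
            if lo <= b["start_datetime"] <= hi and not b.get("reminded")]
```

Because the window is one hour wide and the cron runs hourly, every booking falls into exactly one run, so no tour is reminded twice or skipped.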
6 Template & Links
Templates & Links Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.
7 Template & Links
Questions & Research Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.
8 Template & Links
9 Template & Links
10 Template & Links
11 Template & Links
12 Template & Links
Need To Start Afresh?
BACK TO REFINE
Tweaked & Good To Go?
PROCEED TO DEPLOY