We recently built the Story Licensing application using some of the most powerful development tools available. By offloading the non-core development functionality onto available API services, we were able to build this application faster than I ever could have imagined. I’m excited to share with you how we did it.
The Requirements
Every application starts with a goal and requirements. For this application, the goal was to acquire leads interested in purchasing the rights to repost online content from publications in the AMI Publications network. Our requirements:
- Stories (for our app, stories published on @ami sites) need to be quickly and easily searchable
- Stories need to indicate a ranking of publicly available social validation (Claps) to the interested lead
- Ability to capture the lead’s email (before the search action)
- Ability to capture the bid request data: story title, lead email, search term bid and send an email to the admin
- Ability to manage copy content by a content manager via a content management system
The Services
For this app we knew we didn’t want to build and manage *everything*. Instead, we wanted to offload non-core development to the best available services and build only what we needed to. Any web project on a time crunch (pretty much every project) would be well-advised to do this, so you don’t unnecessarily reinvent functionality that already exists at a high level. So for this app we decided to use:
- Cosmic JS for content management and data storage; this is also where our Medium articles are stored, via the Medium Backup Application.
- Algolia to search stories. It definitely satisfies our speedy search requirement (It’s FAST!)
- SendGrid for sending email notifications
- @ami for supplying the story library
So with all of these, we’ve got best-in-class services for each piece of functionality, and building our application would mostly involve wiring these services together.
Building the Search Functionality
For this app I knew I wanted to use React (mainly for personal preference). So in keeping with the “build only what you have to” theme, the natural framework choice for me was Next.js. Their value proposition is “React Applications Made Simple”, and it delivers: you can get a React app up and running without a lot of boilerplate while keeping a nice developer experience.
To integrate Algolia, I needed to implement the search bar in the React app. Luckily, they’ve got an excellent InstantSearch React component, which makes integration into React apps easy. Just set your appId, apiKey and indexName.
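As a rough sketch of the wiring (component and prop names follow the react-instantsearch package as it existed when this was built; the credentials are placeholders):

```jsx
import { InstantSearch, SearchBox, Hits } from 'react-instantsearch/dom'

// Placeholder credentials: use the values from your own Algolia dashboard
const Search = () => (
  <InstantSearch
    appId="YOUR_APP_ID"
    apiKey="YOUR_SEARCH_ONLY_API_KEY"
    indexName="Stories"
  >
    <SearchBox />
    <Hits />
  </InstantSearch>
)

export default Search
```

Note the search-only API key here: the admin key used by the indexing scripts below should never ship to the browser.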
You can find these values in your Algolia dashboard.

Now that I had the search bar implemented, I needed to add records to Algolia: specifically, the Medium articles saved in Cosmic JS. I created a script to add Objects from Cosmic JS into our Algolia search index.
Here’s add-records.js:

```javascript
require('dotenv').config()
const Cosmic = require('cosmicjs')
const api = Cosmic()
const async = require('async')
const algoliasearch = require('algoliasearch')
// algoliasearch(applicationId, adminApiKey)
const client = algoliasearch(process.env.ALGOLIA_ACCOUNT, process.env.ALGOLIA_INDEX)
const index = client.initIndex('Stories')
const buckets = ['your-bucket-slug', 'another-bucket-slug'] // Add all Bucket slugs here
async.eachSeries(buckets, (bucket_slug, bucketEachCallback) => {
  const bucket = api.bucket({ slug: bucket_slug })
  let added_count = 0
  const addRecords = (skip = 0) => {
    const locals = {}
    async.series([
      callback => {
        bucket.getObjects({
          type: 'posts',
          limit: 1000,
          skip
        }).then(data => {
          locals.objects = data.objects
          locals.total = data.total
          console.log('Total:', data.total)
          callback()
        }).catch(err => {
          console.log(err)
        })
      },
      () => {
        async.eachSeries(locals.objects, (object, eachCallback) => {
          // Save Algolia record (drop the full content to keep records small)
          delete object.content
          index.addObject(object, (err, content) => {
            if (err) return eachCallback(err)
            console.log('ADD', object.slug, 'objectID=' + content.objectID)
            added_count++
            eachCallback()
          })
        }, () => {
          console.log('Added ' + added_count + ', Total: ' + locals.total)
          if (added_count !== locals.total) {
            addRecords(added_count) // Fetch the next page of Objects
          } else {
            console.log('All done FOR REAL!')
            bucketEachCallback()
          }
        })
      }
    ])
  }
  addRecords() // Call after the declaration (const is not hoisted)
})
```
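For reference, this script (and get-claps.js below) reads the Algolia credentials from a .env file along these lines. The variable names come from the code; note that the second value the client expects is the admin API key, despite the ALGOLIA_INDEX name, and the values here are placeholders:

```
ALGOLIA_ACCOUNT=your-algolia-application-id
ALGOLIA_INDEX=your-algolia-admin-api-key
```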
Next I needed to get claps for stories, because the Medium XML feed for stories (Example: https://hackernoon.com/feed) doesn’t include this vital information for our app. So to do this, I needed to create a worker script that could run daily, updating all of our story clap counts.
Here’s get-claps.js:

```javascript
require('dotenv').config()
const async = require('async')
const axios = require('axios')
const algoliasearch = require('algoliasearch')
// algoliasearch(applicationId, adminApiKey)
const client = algoliasearch(process.env.ALGOLIA_ACCOUNT, process.env.ALGOLIA_INDEX)
const index = client.initIndex('Stories')
const getClaps = () => {
  let hit_count = 0
  index.browse('', {}, function browseDone(err, content) {
    if (err) {
      throw err
    }
    async.eachSeries(content.hits, (hit, callback) => {
      const medium_url = hit.metadata.medium_link
      if (!medium_url)
        return callback()
      axios.get(medium_url).then(response => {
        // The clap count is embedded in the page's serialized state
        const str1 = '"totalClapCount":'
        const str2 = ',"sectionCount'
        const claps = Number(response.data.split(str1).pop().split(str2).shift())
        index.partialUpdateObject({
          claps: claps,
          objectID: hit.objectID
        }, function(err) {
          if (err) throw err
          callback()
        })
      }).catch(err => {
        console.log(err)
        callback()
      })
    }, () => {
      hit_count = hit_count + content.hits.length
      if (content.cursor) {
        // More pages of records to browse
        index.browseFrom(content.cursor, browseDone)
      } else {
        console.log('DONE!', hit_count)
      }
    })
  })
}
getClaps()
```
This script does the following: gets all of the saved records in Algolia, fetches each story’s page, parses the embedded clap count out of the page data, and then updates the record in Algolia. This process can run in the background (for example, on a daily schedule) to save the latest clap count for each story. It will increase our record operations per day in Algolia, increasing our costs, but it’s worth it to show the latest social validation for each story.
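The string-splitting parse above can be made a bit more defensive. Here’s a hypothetical standalone version of the extraction step (a regex instead of split, returning 0 when the count isn’t found; the function name and sample HTML are mine, not from the app):

```javascript
// Extract the clap count from a story page's HTML, where the count
// appears in an embedded JSON blob as "totalClapCount":<number>
function extractClapCount(html) {
  const match = html.match(/"totalClapCount":(\d+)/)
  return match ? Number(match[1]) : 0
}

// A trimmed-down stand-in for a real page's embedded state:
const sampleHtml = '{"value":{"totalClapCount":1234,"sectionCount":5}}'
console.log(extractClapCount(sampleHtml)) // 1234
console.log(extractClapCount('<html>no state here</html>')) // 0
```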
With the search results coming from Algolia, the search term relevance gets priority, but we can also reorder the stories to show the highest clap count on top:
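One way to express that ordering in Algolia is via the index’s customRanking setting, which breaks ties in textual relevance. A sketch, assuming the v3 JavaScript client used in the scripts above:

```javascript
// Among equally relevant matches, put the highest clap count on top
index.setSettings({
  customRanking: ['desc(claps)']
}, (err) => {
  if (err) throw err
})
```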
Saving the Lead Emails and Bid Requests
Since the goal of this app is to save bid requests from potential customers, we chose SendGrid as a reliable email service provider. I created two endpoints: one to save the lead email prior to search, and one to save the bid request information after a story has been found.
Here’s leads.js:

```javascript
module.exports = function(req, res) {
  const Cosmic = require('cosmicjs')
  const api = Cosmic()
  const bucket = api.bucket({
    slug: 'app-bucket-slug',
    write_key: process.env.COSMIC_WRITE_KEY
  })
  // Save the lead's email as an Object in Cosmic JS
  bucket.addObject({
    title: 'Lead - ' + (new Date()),
    type_slug: 'leads',
    metafields: [{
      title: 'Email',
      key: 'email',
      type: 'text',
      value: req.body.email
    }],
    options: {
      content_editor: false,
      slug_input: false
    }
  }).then(data => {
    res.json(data)
  }).catch(err => {
    console.error(err)
    res.json({ success: false })
  })
}
```
And here’s bids.js:

```javascript
module.exports = async function(req, res) {
  const Cosmic = require('cosmicjs')
  const sgMail = require('@sendgrid/mail')
  const async = require('async')
  sgMail.setApiKey(process.env.SENDGRID_API_KEY)
  const api = Cosmic()
  const bucket = api.bucket({
    slug: 'story-licensing',
    write_key: process.env.COSMIC_WRITE_KEY
  })
  // Store the bid in Cosmic JS for record keeping
  await bucket.addObject({
    title: 'Bid - ' + (new Date()),
    type_slug: 'bids',
    metafields: [
      { title: 'Email', key: 'email', type: 'text', value: req.body.email },
      { title: 'Bid', key: 'bid', type: 'text', value: req.body.bid },
      { title: 'Post Title', key: 'post_title', type: 'text', value: req.body.post_title },
      { title: 'Post Link', key: 'post_link', type: 'text', value: req.body.post_link }
    ],
    options: {
      content_editor: false,
      slug_input: false
    }
  })
  async.series([
    callback => {
      // Send notification to the admin
      const subject = 'A Bid has been received'
      const html_body = '<div>A bid has been received for <strong>' + req.body.post_title + '</strong> for <strong>$' + req.body.bid + '</strong></div>'
      const message = {
        from: {
          email: req.body.email
        },
        to: process.env.BID_EMAIL,
        subject: subject,
        text: subject, // plaintext fallback
        html: html_body
      }
      sgMail.send(message)
        .then(() => {
          callback()
        })
        .catch(error => {
          console.error(error.toString())
          res.json({ success: false })
        })
    },
    () => {
      // Send confirmation to the bidder
      const subject = 'Your bid has been received'
      const html_body = '<div>A bid has been received for <strong>' + req.body.post_title + '</strong> for <strong>$' + req.body.bid + '</strong>. Thank you.</div>'
      const message = {
        from: {
          email: process.env.BID_EMAIL,
          name: 'Story Licensing'
        },
        to: req.body.email,
        subject: subject,
        text: subject, // plaintext fallback
        html: html_body
      }
      sgMail.send(message)
        .then(() => {
          res.json({ success: true })
        })
        .catch(error => {
          console.error(error.toString())
          res.json({ success: false })
        })
    }
  ])
}
```
Notice that for both of these actions (lead email and bid request), we also store the data in Cosmic JS for admin record keeping. Cosmic JS now has this data available to view, query and deliver via the Cosmic API to any future applications.
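On the client side, the app only needs to POST JSON matching the req.body fields the endpoints read. A hypothetical helper (the function name, story shape and /api/bids route are mine, not from the app):

```javascript
// Build the JSON body that bids.js reads from req.body
function buildBidPayload(email, bid, post) {
  return {
    email: email,
    bid: bid,
    post_title: post.title,
    post_link: post.link
  }
}

// Hypothetical usage in the browser, assuming the endpoint is mounted at /api/bids:
// fetch('/api/bids', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildBidPayload('lead@example.com', '100', story))
// })
```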
In Conclusion
The finished app is now available to help you find high-quality content from the AMI Publications network to license to your website or blog. Check it out here.
Resources Used
I’m happy with the way this project came together. The development was fast thanks to using the best services and developer tools available (Algolia, SendGrid, Next.js and Cosmic JS) to deliver a fast and scalable application. Let me know your thoughts, join the conversation on Slack and follow Cosmic JS on Twitter.