Using Next.js and Vercel to instantly load a data-heavy website
Article originally published on Tinloof.
A React application is JavaScript code that gets transformed into static HTML. This transformation is called "rendering".
Whenever you build a React application, you're inevitably making a decision about when to render it, and you usually have 3 choices:
- Render on the server at every request (server-side rendering).
- Render ahead of time, when the application is built (static generation).
- Render in the browser, after the JavaScript loads (client-side rendering).
A while ago, we faced this scenario when building illuminem, an energy news aggregator that showcases thousands of posts daily.
In this article, we'll talk about the performance problems we faced and how we ended up leveraging Next.js and Vercel to solve them.
illuminem's architecture consists of a service that crawls RSS feeds and web pages for energy-related posts, categorizes them, and pushes them to a headless CMS called Sanity.
On the CMS, content managers create collections of these posts based on filters like "category".
For example, they can create a collection called "Renewables" and use the "category" filter to only include posts that match the "renewables" category.
The frontend is a Next.js application that fetches these collections and displays them as carousels.
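To give an idea of what that looks like, here is a rough sketch of how the frontend might fetch one of these collections with Sanity's JavaScript client and a GROQ query. It's an illustration rather than illuminem's actual code: the client configuration, document types, and field names (collection, slug, category, posts) are assumptions.

import { createClient } from "@sanity/client";

const client = createClient({
  projectId: "<your-project-id>",
  dataset: "production",
  apiVersion: "2021-10-21",
  useCdn: true,
});

// Fetch a "collection" document by its slug, along with the posts
// matching its category filter, newest first
export function fetchCollectionFromCMS(slug) {
  return client.fetch(
    `*[_type == "collection" && slug.current == $slug][0]{
      title,
      "posts": *[_type == "post" && category == ^.category]
        | order(publishedAt desc)[0...100]
    }`,
    { slug }
  );
}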
Building a product is not easy because requirements change throughout the process, so we played it safe to make sure we could stay flexible enough to handle these changes and reach the finish line ahead of time.
We were not sure how often we'd get new posts from the crawler, so we rendered most of our pages server-side.
We used getServerSideProps to fetch page data from the CMS at every request.
Here's a simplified example from the homepage:
export default function HomePageContainer({ data }) {
  return (
    <Layout>
      <HomePage data={data} />
    </Layout>
  );
}

// Called on the server at each request
export async function getServerSideProps() {
  try {
    const data = await fetchHomeDataFromCMS();
    return {
      props: { data },
    };
  } catch (error) {
    console.error("Error fetching homepage data", error);
    // Show the 404 page if the CMS request fails
    return { notFound: true };
  }
}
By the time we were done, the crawler had been running for 2 months and we started to feel how heavy our pages had become.
Even after limiting the number of posts per collection, each carousel could have hundreds of posts and most of our pages had dozens of carousels, so we're talking about thousands of posts per page.
On average, it took 5 seconds to load a page on a very good WiFi connection.
It was no surprise that our TTFB (Time to First Byte) was heavily impacted, since every time a user visited a page:
- The server had to make a request with a huge query to the CMS.
- The CMS had to parse that query and form the response data.
- Once the server received a response from the CMS with thousands of posts, it had to render the React application before sending it to the browser.
Some of our pages didn't need to fetch any data before rendering, so they had no getServerSideProps at all. Next.js pre-rendered these pages as static HTML by default.
But what if a page needs to fetch data before it can be built?
Well, Next.js provides getStaticProps, which lets us fetch the data and render the page at build time. This creates static pages that load instantly.
export default function HomePageContainer({ data }) {
  return (
    <Layout>
      <HomePage data={data} />
    </Layout>
  );
}

// Called at build time
export async function getStaticProps() {
  try {
    const data = await fetchHomeDataFromCMS();
    return {
      props: { data },
    };
  } catch (error) {
    console.error("Error fetching homepage data", error);
    // Show the 404 page if the CMS request fails
    return { notFound: true };
  }
}
Unfortunately, most of the other pages could not be completely static. Most of them have a "Most Trending" carousel that displays the most viewed posts of the past 48 hours, so it has to stay up-to-date with the actual view metrics.
If we fetched the data at build time, the "Most Trending" carousel wouldn't be updated until the next build.
At this point, we wondered: why not make these pages render client-side?
The server wouldn't have to do any of the heavy work of querying data and rendering the page.
Instead, each carousel can make a request to fetch its collection of data and then render it.
The main advantage would be that the TTFB would drastically decrease, making the page reach the browser pretty fast.
However, knowing that each page has on average 12-15 carousels, that would result in 12-15 queries per page visit. Our CMS pricing plan is based on the number of queries we make, so we would hit our limit in no time, and the problem would only get worse as illuminem picks up more users.
On top of that, what we gain in performance on the server is lost in the client. The page would reach the browser fast, but it would mostly be a bunch of spinners: each carousel would still have to make a request to fetch its data and then render it, as in the sketch below.
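For illustration, a client-side carousel would have looked roughly like this. The Carousel and Spinner components, and the fetchCollectionFromCMS helper, are hypothetical names, not illuminem's actual code.

import React from "react";

function CarouselContainer({ collectionSlug }) {
  const [posts, setPosts] = React.useState(null);

  React.useEffect(() => {
    // One CMS query per carousel, per page visit
    fetchCollectionFromCMS(collectionSlug).then((collection) =>
      setPosts(collection.posts)
    );
  }, [collectionSlug]);

  // Until the request resolves, the user only sees a spinner
  if (!posts) {
    return <Spinner />;
  }

  return <Carousel posts={posts} />;
}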
Because of these two reasons, client-side rendering was off the table.
Note: if you prefer video format, here's a video explaining this section.
Next.js introduced Incremental Static Regeneration in version 9.5, making it possible to regenerate static pages at run-time.
We can now generate static pages at build time, which makes them load instantly.
But how can we keep the "Most Trending" carousel content up-to-date?
When a user visits one of these pages after the revalidation window has passed, the Next.js server re-runs getStaticProps in the background.
If the result of getStaticProps is different from the previous run because the CMS data changed, the stale page is replaced by an updated one.
The updated page is generated at run-time without affecting the user experience.
The best part is that we only had to set the revalidate property to 3600 to revalidate the page at most once every hour.
export default function HomePageContainer({ data }) {
  return (
    <Layout>
      <HomePage data={data} />
    </Layout>
  );
}

// Called at build and run-time
export async function getStaticProps() {
  try {
    const data = await fetchHomeDataFromCMS();
    return {
      props: { data },
      // Revalidates the page every hour
      revalidate: 60 * 60,
    };
  } catch (error) {
    console.error("Error fetching homepage data", error);
    // Show the 404 page if the CMS request fails
    return { notFound: true };
  }
}
For pages that depend on a route parameter (e.g. /[category]), we were able to generate a static page for each possible parameter by using the getStaticPaths method:
export default function CategoryPageContainer({ data }) {
  return (
    <Layout>
      <CategoryPage data={data} />
    </Layout>
  );
}

export async function getStaticProps({ params: { category } }) {
  try {
    const data = await fetchCategoryDataFromCMS(category);
    return {
      props: { data },
      // Revalidate at most once per second
      revalidate: 1,
    };
  } catch (error) {
    console.error("Error fetching category data", error);
    // Show the 404 page if the CMS request fails
    return { notFound: true };
  }
}

export async function getStaticPaths() {
  const categories = await fetchCategoriesFromCMS();
  return {
    paths: categories.map((category) => ({
      params: { category },
    })),
    // Categories not returned above result in a 404
    fallback: false,
  };
}
Users can click on a post to see its details in a modal and share it on social media.
Each post modal has a URL, and we could add the meta tags required to show a card preview snippet on social media platforms.
Unfortunately, when such URLs were shared, social media platforms could not pick up the right meta tags, since those were only added once the modal appeared in the client.
To fix that, we generated a static page for each post at run-time.
Such pages only have the post modal rendered statically with the right meta tags; the rest of the page is rendered client-side.
We then used the URLs of these pages when sharing on social media.
export default function PostPage({ postData }) {
  const [homeData, setHomeData] = React.useState(null);

  // The rest of the page is fetched and rendered client-side
  React.useEffect(() => {
    fetchHomeDataFromCMS().then(setHomeData);
  }, []);

  return (
    <>
      <Layout>{!homeData ? null : <HomePage data={homeData} />}</Layout>
      <PostModal data={postData} />
    </>
  );
}

export async function getStaticProps({ params: { postId } }) {
  try {
    const postData = await fetchPostDataFromCMS(postId);
    return {
      props: { postData },
      revalidate: 60 * 60,
    };
  } catch (error) {
    console.error("Error fetching post data", error);
    // Fallback to 404 page in case of error
    return { notFound: true };
  }
}
// Nothing is generated at build time
export async function getStaticPaths() {
  return {
    paths: [],
    fallback: "blocking",
  };
}
We set fallback to "blocking" in getStaticPaths so that the first request to a post page only resolves once the page has finished being generated on the server. You can read more about the other fallback options Next.js offers in its documentation.
That first request might be a bit slow, but all the following requests resolve immediately because the static version of the page has already been generated.
Social media platforms now display a proper snippet of the shared post because the required meta tags are available immediately in the HTML response.
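For completeness, the statically rendered modal can declare its meta tags with next/head along these lines. This is a sketch rather than illuminem's actual markup: the post fields (title, summary, imageUrl, url) are assumptions.

import Head from "next/head";

function PostModal({ data }) {
  return (
    <>
      <Head>
        {/* These tags end up in the statically generated HTML,
            so social media crawlers can read them without running JavaScript */}
        <title>{data.title}</title>
        <meta property="og:title" content={data.title} />
        <meta property="og:description" content={data.summary} />
        <meta property="og:image" content={data.imageUrl} />
        <meta property="og:url" content={data.url} />
        <meta name="twitter:card" content="summary_large_image" />
      </Head>
      {/* ...the rest of the modal markup... */}
    </>
  );
}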
If you plan to build or need help building a product using Vercel and Next.js, get in touch.