Anthony McLin

Leveraging opacity to improve UX on a list of blog posts

When I built the theme for this site, browsers did not have fast Javascript execution, and redraw frames were universally slow. My original vision of fading out a list of blog posts as they scroll off screen wasn’t achievable while still maintaining a usable experience, so I compromised my design intent with a sequential transparency effect as the blog post list got long. Modern browser capabilities have resolved my original concerns, so follow along as I take you through the steps of implementing a blog list that fades out as the items scroll off screen.

First, let’s cover the existing behavior. As the user scrolls further and further down the page, the items gradually fade out. When the visitor hovers over an item, it is brought back to full opacity so that it is legible.

This interaction was designed around the assumption of keyboard and mouse, before the popularity of mobile browsing. Even on early mobile browsers (notably Safari on iOS) it was still a usable experience, as a visitor’s first tap would trigger the CSS hover state, bringing the item into full opacity. However, as mobile browsing has grown in popularity, that isn’t as reliable an interaction method, and it’s not very intuitive.

When I wrote it, I was optimizing for minimal CSS and Javascript, as rendering speed was a significant challenge. Since CSS pseudo-selectors like nth-child were not widely supported, it was typical to leverage enumerated classes to manage sequential effects. Drupal provides these enumerated classes out of the box, so the manually written CSS was fairly straightforward.

Incremental transparency using CSS classes

We start by setting the baseline style for the elements, and include the hover state:

section {
  opacity: 0.5; /* Minimum transparency for items beyond the enumerated list */
  transition: opacity 0.5s;
}
section:hover {
  opacity: 1; /* Highlight item when user is interacting with it */
}

Next we go through each enumerated class, making it slightly more transparent than the last:

.section-1 {
  opacity: 1;
}
.section-2 {
  opacity: 0.9;
}

/* ...and so on, down to... */

.section-9 {
  opacity: 0.55;
}
.section-10 {
  opacity: 0.5;
}

It’s relatively straightforward, but tedious to write, as each class needs to be defined individually. It’s also painful to maintain: if you want to change the rate at which items fade, they all have to be rewritten. With the advent of CSS preprocessors, we can take advantage of loops to generate this same block of CSS using much less code.

Incremental transparency using SASS

Here is the same demo, but written in SASS:

$maxOpacity: 1.0;
$minOpacity: 0.5;
$steps: 10; // Total number of steps to reach the minimum

section {
    opacity: $minOpacity;
    transition: opacity 0.5s;
    // Hover effect so visitors can see the
    // item they're interacting with
    &:hover {
        opacity: $maxOpacity;
    }
}

.section {
    // First item is maximum opacity
    &-1 {
        opacity: $maxOpacity;
    }
    // Iterate through the intermediate items
    @for $i from 2 to ($steps + 1) {
        &-#{$i} {
            opacity: $maxOpacity - ($i * ($minOpacity / $steps));
        }
    }
}

See the Pen Vertical Transparency with classes by Anthony McLin (@amclin) on CodePen.

The output CSS is the same, but the intent of what’s going on is much clearer. Ideally, though, we shouldn’t have to rely on enumerated classes in the HTML at all. With a CMS, we depend on the CMS to provide the enumeration. Handwritten HTML would get tedious. A static site generator would require extra functions to manage it. A headless JS app would need extra state tracking to maintain the enumeration. All of this is unnecessary overhead that can be replaced by pseudo-selectors. Here’s the same example SASS, rewritten to use pseudo-selectors. You can see in the HTML we no longer have to define any classes:
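For reference, here is a sketch of what that pseudo-selector version might look like, assuming the same variables as before (the Pen has the exact code; note the hover rule moves after the nth-child rules, since both selectors have equal specificity and the later one wins):

```scss
$maxOpacity: 1.0;
$minOpacity: 0.5;
$steps: 10; // Total number of steps to reach the minimum

section {
    opacity: $minOpacity;
    transition: opacity 0.5s;

    // First item is maximum opacity
    &:first-child {
        opacity: $maxOpacity;
    }

    // Iterate through the intermediate items
    @for $i from 2 to ($steps + 1) {
        &:nth-child(#{$i}) {
            opacity: $maxOpacity - ($i * ($minOpacity / $steps));
        }
    }

    // Hover comes last so it overrides the nth-child rules,
    // which share the same specificity
    &:hover {
        opacity: $maxOpacity;
    }
}
```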

See the Pen Vertical Transparency with pseudo-selectors by Anthony McLin (@amclin) on CodePen.

Better touch screen UX using Javascript

This certainly makes the code easier to maintain, but the user experience on touch devices still isn’t great. It’s not obvious to the visitor that they need to tap on an item to bring it into the highlighted state, and there’s a good chance that when they do tap, they’ll inadvertently trigger one of the links or other tap-triggered behaviors inside the items. So how can we leverage opacity to emphasize the primary content that’s onscreen at any given point?

The solution is the original design I had intended for this site, but could not implement at the time, as Javascript execution and page rendering were far too slow. Competition in the browser space triggered an arms race to produce fast rendering and JS execution engines, so it’s now viable to use Javascript to detect whether an item is the primary onscreen element.

We can also provide a safe fallback for visitors with Javascript disabled. First we’ll define the safe fallback and remove all the pseudo-selectors, dramatically simplifying and reducing the CSS:

$maxOpacity: 1.0;
$minOpacity: 0.5;

section {
	opacity: $maxOpacity; // Fallback until JS is running
	transition: 0.5s opacity;
	.js & {
		opacity: $minOpacity;
	}
}

Next we’ll write a Javascript helper function that determines whether an element is within the primary focus area of the viewport. I’m using a 25% threshold here: if the bottom of an element is in the top 25% of the viewport, it’s considered to be scrolling offscreen. Likewise, if the top of a block is in the bottom 25% of the screen, I don’t consider it fully onscreen.

const screenMargin = 0.25; // Top and bottom threshold to trigger

/**
 * Identifies if the element is within the defined display area
 * @param {Object} el
 * @returns {Boolean}
 */
const isOnScreen = (el) => {
	const elRects = el.getClientRects()[0];
	const winHeight = window.innerHeight;
	// Top of object is above bottom of viewport margin
	const topIsOnscreen = (elRects.top < (winHeight * (1 - screenMargin)));
	// Bottom of object is below the top of viewport margin
	const bottomIsOnscreen = (elRects.bottom > (winHeight * screenMargin));

	return (topIsOnscreen && bottomIsOnscreen);
};

Now to manage the opacity, I introduce a recursive function that leverages requestAnimationFrame() to update the opacity of each element. Each iteration loops through the list of elements, checks to see if the element is onscreen based on the previously defined function, and then sets the appropriate opacity.

const maxOpacity = 1; // Fully highlighted elements
const minOpacity = 0.5; // Defocused elements
const sections = document.querySelectorAll('section');

/**
 * Sets the correct opacity for all elements and
 * iterates to the next animation frame
 */
const update = () => {
	sections.forEach( (section) => {
		var opacity = minOpacity;
		if(isOnScreen(section)) {
			opacity = maxOpacity;
		}
		section.style.opacity = opacity;
	});
	requestAnimationFrame(update);
};

/**
 * Starts the sequence
 */
const init = () => {
	// Set CSS
	document.body.classList.add('js');
	// Start the effects
	requestAnimationFrame(update);
};

Put it all together, with the init function adding the "js" class to the body and starting the animation chain:

See the Pen Vertical Transparency - Fade on Scroll by Anthony McLin (@amclin) on CodePen.

Performance considerations

For this example I’ve chosen to keep the transition duration managed in CSS with a fixed value. To get a more precise effect, it would be possible to directly set the intermediate opacity values of each element inside the update() function instead. That would require more execution steps on each frame, which could become a performance concern.
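As a sketch of that alternative, the opacity could be computed each frame from how far the element’s edges have entered the top or bottom margins. This is a hypothetical helper (not in the Pen above); it’s a pure function, so the DOM reads stay outside of it, and the CSS transition would be removed since Javascript now animates the value directly:

```javascript
const screenMargin = 0.25; // Same threshold as isOnScreen()
const maxOpacity = 1;
const minOpacity = 0.5;

/**
 * Computes a continuous opacity based on how far the element's
 * edges have crossed into the top or bottom viewport margins.
 * @param {Number} top - element top, relative to the viewport
 * @param {Number} bottom - element bottom, relative to the viewport
 * @param {Number} winHeight - viewport height
 * @returns {Number} a value between minOpacity and maxOpacity
 */
const opacityFor = (top, bottom, winHeight) => {
	const marginPx = winHeight * screenMargin;
	// 1 while the bottom edge sits below the top margin line,
	// falling to 0 as the element scrolls up out of view
	const fromTop = Math.min(Math.max(bottom / marginPx, 0), 1);
	// 1 while the top edge sits above the bottom margin line,
	// falling to 0 as the element scrolls down out of view
	const fromBottom = Math.min(Math.max((winHeight - top) / marginPx, 0), 1);
	const visibility = Math.min(fromTop, fromBottom);
	return minOpacity + (maxOpacity - minOpacity) * visibility;
};
```

Inside update(), each element would then get `section.style.opacity = opacityFor(rect.top, rect.bottom, window.innerHeight)` using its getClientRects() values.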

Likewise, if there are many elements on the page (like on sites with infinite scrolling), looping through the entire list to do the calculations could become problematic. To counter this, I would consider setting a state on each object once they’re far away from the viewable area so that you don’t have to recalculate the positions and opacities. You only would need to find the items close to the viewable area, and everything before or after those threshold elements could be skipped.
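One way to sketch that: since the items appear in document order, their top offsets form a sorted array, so a binary search can find the window of items near the viewport and everything outside it can be skipped. This is a hypothetical helper, not part of the Pens above:

```javascript
/**
 * Returns the index of the first value in a sorted array
 * that is greater than or equal to x.
 */
const lowerBound = (arr, x) => {
	let lo = 0;
	let hi = arr.length;
	while (lo < hi) {
		const mid = (lo + hi) >> 1;
		if (arr[mid] < x) {
			lo = mid + 1;
		} else {
			hi = mid;
		}
	}
	return lo;
};

/**
 * Given the sorted top offsets of the items (in document order),
 * returns [start, end) indexes of the items near the viewport.
 * Items outside this range can keep their last opacity.
 * @param {Number[]} tops - document-relative top offset of each item
 * @param {Number} scrollTop - current scroll position
 * @param {Number} winHeight - viewport height
 */
const visibleRange = (tops, scrollTop, winHeight) => {
	// Back up one, since the previous item may still extend into view
	const start = Math.max(lowerBound(tops, scrollTop) - 1, 0);
	// Anything starting below the bottom of the viewport is skipped
	const end = lowerBound(tops, scrollTop + winHeight);
	return [start, end];
};
```

The per-frame loop would then only call isOnScreen() on the items between start and end, which stays cheap even on an infinitely scrolling list.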
