Integrating Touch Support to Drag-and-Drop Interfaces

In the previous blog post, we went over how to use the HTML Drag and Drop API, a well-supported web API available across all major browsers. Here, we will extend that demo to support touch interactions so that our application can be used across a wider range of devices while providing the same functionality the user expects.

For demonstration purposes, I will simulate a mobile device using the Device Toolbar in Chrome Developer Tools. This keeps the external setup to a minimum.

How to Use the Examples

Throughout the article, I will link to a CodePen with playable demos that supplement the concepts described here. The examples require some additional setup to enable touch controls. Here is what that looks like in Chrome:

  • Open Chrome Developer Tools: Option + Command + I on macOS, Ctrl + Shift + I on Windows/Linux, or right-click inside the browser window and select “Inspect”.
  • Enable the Device Toolbar: Command/Ctrl + Shift + M, or select the device icon in the toolbar.*
  • Choose a device preset to see how the example looks on various screens. You can also drag the corners to resize the viewport.

* Support in other browsers may vary. In Chrome, touch controls are enabled automatically when the Device Toolbar is active, but other browsers may require additional steps. In Firefox, for example, the equivalent feature is “Responsive Design Mode”, and you must also explicitly select “Enable touch simulation” to toggle touch support.

Why Support Touch Surfaces?

In today’s digital landscape, ensuring your web applications are fully accessible and functional across all devices is more crucial than ever. With a significant portion of internet traffic coming from mobile devices, integrating touch support into your web applications is not just an enhancement but a necessity. The first part of our blog series introduced the implementation of the Drag-and-Drop API for desktop browsers. However, as we extend this functionality in the second part here, we focus on touch surfaces, which are predominant in mobile devices. By adapting drag-and-drop capabilities to recognize touch events, we can offer a seamless, intuitive user experience that mirrors interactions users expect on their smartphones.

Why isn’t touch support enabled automatically in the previous example? The Drag and Drop API uses the DragEvent interface, which inherits its properties from the MouseEvent interface. On a mobile phone, we typically don’t have access to a pointing device such as a mouse. For touch surfaces, such as the screen of a smartphone, we instead need to listen for TouchEvents. The rest of this post explores how to extend our drag-and-drop demo to support these events via a “touch-drag” gesture, ensuring a seamless user experience across devices.
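Before diving in, note that you can detect at runtime whether touch handlers are needed at all. The following is a minimal sketch of one common feature-detection approach; it is not part of the original demo:

const supportsTouch = "ontouchstart" in window || navigator.maxTouchPoints > 0;

if (supportsTouch) {
  // Attach the touch handlers described later in this post.
} else {
  // Rely solely on the standard Drag and Drop API from the first post.
}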

Understanding Touch Events

Touch events are similar to mouse events but are specifically designed to interpret interactions from a finger or stylus input on a touch-sensitive surface such as a touchscreen or a trackpad.

The primary touch events we will focus on are:

  • touchstart: A touch point is placed on the touch surface.
  • touchmove: A touch point is moved along the touch surface.
  • touchend: A touch point is removed from the touch surface.

In this example, using the touch simulator, we press down on the surface, move the recorded touch point some distance, and finally end the touch by reducing pressure until contact is no longer registered.*

* Simulating touch support requires additional setup. Please ensure all steps were followed in the ‘How to Use the Examples’ section.
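
Before adapting the drag-and-drop code, it can help to observe this lifecycle directly. The following minimal sketch, which assumes a page containing an element with the class .draggable, simply logs each phase of a touch:

const target = document.querySelector(".draggable");

target.addEventListener("touchstart", (e) => {
  console.log("touchstart at", e.touches[0].clientX, e.touches[0].clientY);
});
target.addEventListener("touchmove", (e) => {
  console.log("touchmove to", e.touches[0].clientX, e.touches[0].clientY);
});
target.addEventListener("touchend", () => {
  // The lifted finger no longer appears in e.touches on touchend.
  console.log("touchend");
});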

Adapting a Drag-and-Drop Interface for Touch Input

To make our drag-and-drop setup touch compatible, we need to map TouchEvents to the corresponding mouse events used in the Drag and Drop API. In the following sections, we go through the initial setup for hooking into the touch event lifecycle, map specific touch events to their DragEvent counterparts to achieve the same user experience, and finally update the view to show the final positions of our elements.

Initial Setup

We begin with a basic drag-and-drop setup; by the end, we will arrive at the drag-and-drop-with-touch-support demo shown below. The initial code listens for dragstart and dragend events on draggable elements and handles the dragover event on containers to decide dynamically where within the container the dragged element should be dropped. This setup works well on desktops, where mouse events are the primary mode of interaction.

const draggables = document.querySelectorAll('.draggable');
const containers = document.querySelectorAll('.container');

draggables.forEach(draggable => {
  // Toggle a class so we can style and query the element being dragged.
  draggable.addEventListener('dragstart', () => {
    draggable.classList.add('dragging');
  });
  draggable.addEventListener('dragend', () => {
    draggable.classList.remove('dragging');
  });
});

containers.forEach(container => {
  container.addEventListener('dragover', e => {
    // preventDefault() marks the container as a valid drop target.
    e.preventDefault();
    // Find the sibling the dragged element should be inserted before.
    const afterElement = getDragAfterElement(container, e.clientY);
    const draggable = document.querySelector('.dragging');
    if (afterElement == null) {
      container.appendChild(draggable);
    } else {
      container.insertBefore(draggable, afterElement);
    }
  });
});
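
The getDragAfterElement helper, introduced in the first post, returns the first element whose vertical midpoint sits below the pointer, or undefined when the dragged element belongs at the end of the container. For reference, a typical implementation looks like the following; your version from the first post may differ slightly:

function getDragAfterElement(container, y) {
  // Consider every draggable in this container except the one being dragged.
  const siblings = [...container.querySelectorAll('.draggable:not(.dragging)')];

  return siblings.reduce((closest, child) => {
    const box = child.getBoundingClientRect();
    // A negative offset means the pointer is above this child's midpoint.
    const offset = y - box.top - box.height / 2;
    if (offset < 0 && offset > closest.offset) {
      return { offset, element: child };
    }
    return closest;
  }, { offset: Number.NEGATIVE_INFINITY }).element;
}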

Adding Touch Support

We extend our functionality to respond to touchstart, touchmove, and touchend events, which are the touch equivalents of the mousedown, mousemove, and mouseup events that help power the drag-and-drop demo.

  1. Initializing Touch Interactions: We start by listening to the touchstart event on the draggable elements. When a touch begins, we activate the dragging state and prevent the default action, which typically involves scrolling or zooming.
  2. Handling Touch Movement: During the touchmove event, we update the position of the draggable element in real-time, making it follow the user’s finger. Additionally, we show a placeholder within potential drop containers to indicate where the element will land if released.
  3. Completing the Touch: On touchend, we finalize the drag operation by moving the draggable element to its new position indicated by the placeholder and then clean up by removing the placeholder and resetting the styles.
  draggable.addEventListener("touchstart", (e) => {
    draggable.classList.add("dragging");
    activeElement = draggable;
    createPlaceholder();

    // stop scroll behavior during touch
    e.preventDefault();
  });

  draggable.addEventListener("touchmove", (e) => {
    const touch = e.touches[0];
    draggable.style.position = "absolute";
    draggable.style.left = `${touch.clientX}px`;
    draggable.style.top = `${touch.clientY}px`;
    draggable.style.width = "300px";
    const potentialContainer = document
      .elementFromPoint(touch.clientX, touch.clientY)
      .closest(".container");
    if (potentialContainer && placeholder) {
      const afterElement = getDragAfterElement(
        potentialContainer,
        touch.clientY
      );
      if (afterElement) {
        potentialContainer.insertBefore(placeholder, afterElement);
      } else {
        potentialContainer.appendChild(placeholder);
      }
    }
    e.preventDefault();
  });

  draggable.addEventListener("touchend", () => {
    draggable.classList.remove("dragging");
    if (activeElement && placeholder && placeholder.parentNode) {
      placeholder.parentNode.insertBefore(activeElement, placeholder);
      placeholder.remove();
      activeElement.style.position = "static";
      activeElement.style.left = "";
      activeElement.style.top = "";
      activeElement.style.width = "";
    }
    activeElement = null;
    placeholder = null;
  });
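
The handlers above rely on some shared state that is not shown in the excerpt: an activeElement reference, a placeholder element, and a createPlaceholder helper. Here is a minimal sketch of what those might look like; the class name and sizing are assumptions for illustration:

// Hedged sketch: shared state and a placeholder factory assumed above.
let activeElement = null;
let placeholder = null;

function createPlaceholder() {
  placeholder = document.createElement("div");
  placeholder.classList.add("placeholder"); // assumed CSS class
  // Roughly match the dragged element's footprint so the layout doesn't jump.
  placeholder.style.height = `${activeElement.offsetHeight}px`;
}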

Here is what the demo looks like now. The first example uses drag events and the second uses touch events.


Note on Testing

While the touch simulators in Chrome Developer Tools are incredibly useful, they do not fully replicate the nuances of physical interaction on actual devices. Simulators bridge the initial gap in the development cycle by providing a quick, convenient way to test touch interactions, but issues with touch sensitivity, gesture recognition, and overall interface responsiveness may only surface on real hardware. Use simulators for early testing, and complement them with thorough testing on physical devices to ensure the quality of the final product.

Conclusion

By extending drag-and-drop functionality to support touch events, you can enhance the mobile user experience of your web applications, allowing users on all types of devices to interact with your application in an intuitive, natural manner. This is by no means everything that can be achieved: more advanced touch gestures can be handled, such as multiple simultaneous touch points or a pinch-and-zoom gesture for scaling an element. I hope the examples provided serve as a starting point for your next application.
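
As a taste of those more advanced gestures, here is a minimal sketch of the core of pinch-to-zoom: tracking the distance between two touch points and deriving a scale factor from it. The target selector and the lack of a persisted baseline are assumptions made for brevity:

const zoomTarget = document.querySelector(".draggable"); // assumed element
let startDistance = null;

function distanceBetween(touches) {
  const dx = touches[0].clientX - touches[1].clientX;
  const dy = touches[0].clientY - touches[1].clientY;
  return Math.hypot(dx, dy);
}

zoomTarget.addEventListener("touchstart", (e) => {
  if (e.touches.length === 2) {
    startDistance = distanceBetween(e.touches);
  }
});

zoomTarget.addEventListener("touchmove", (e) => {
  if (e.touches.length === 2 && startDistance) {
    // Scale relative to the finger spread when the pinch began.
    const scale = distanceBetween(e.touches) / startDistance;
    zoomTarget.style.transform = `scale(${scale})`;
    e.preventDefault();
  }
});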
