Screenshot of an upload documents page on a laptop

Bank Green

How can we reduce the time it takes to rate a bank?

Bank Green's mission is to fight climate change by decarbonizing the banking industry

My role

Lead UX research & design

Tools

Figma, FigJam, Miro

TEAM

1 product designer (me)
1 head of engineering/co-founder
2 senior engineers

In order to speed up the bank rating process, engineers had developed an AI chatbot tool to help the ratings team. However, no one was using the tool. With over 10,000 banks to rate and an average of 3.5 hours to complete a rating, we needed an intuitive solution the raters could use.

To address this, I led user research, updated flows, and collaborated closely with engineers to meet technical constraints. Together we cut the rating process from 2.5 hours to 1 hour, setting the foundation for speeding it up further with AI.

Image of grid of screenshots on light green background

Bank ratings power 2 strategies that pressure banks to defund fossil fuels

Our rating system scores banks on their sustainability by looking at criteria like their policies and funded projects.

01

Consumer campaigns

  • Inform people - how’s their bank doing on sustainability?
  • Empower people to move their money and/or pressure banks to divest from fossil fuels

02

Accreditation

  • Accreditation for green banks (like LEED does for green buildings) establishes consumer trust in banks' commitment to divesting from fossil fuels
  • Provides Bank Green with a path for revenue

Raters use an inefficient bank rating process

Raters use 3 different tools to rate banks. The complete rating-and-review process is error-prone and takes around 3.5 hours.

A screenshot of a spreadsheet with a green background.

Multiple spreadsheets to calculate bank ratings

A screenshot of a computer screen with a kanban style board of bank items

Nuclino to track rated status and store source documents

A computer screen with a list of items on it.

Internal dashboard to manage relevant bank data on our website

Multiple spreadsheets containing formulas and instructions are duplicated for different bank regions, making it difficult to track methodology and instruction updates across sets of banks. The complicated process also makes it difficult to train new raters.

Given that there are over 10,000 banks in just the Anglosphere, we need a more efficient way to rate banks.  

Problem statement

How might we simplify and speed up the bank rating process?

We set some goals to address our problems:

  • Simplify process by having a single touchpoint to rate banks, track status, and store sources
  • Reduce bank rating time by 80-90% (from 2.5 hours to <30 min)
  • Create a self-explanatory process to reduce the need to train new raters

Solution

Green Policy Evaluator (GPE)

GPE is an AI tool used to research and rate banks, potentially qualifying them for the Bank Green Alliance accreditation

Process

Phase 1: Migrating from spreadsheet to a web-app

Phase 1 violated several design heuristics, confusing users

This project started before design was brought in and was driven by engineers, who migrated segments of the spreadsheet process to a low-code frontend in Retool. This addressed some pain points, but left many areas for improvement:

  • Inconsistent application of visibility of system status (NNG heuristic 1)
  • Included irrelevant information without clear visual hierarchy (NNG heuristic 8)
  • No feedback to help users recognize and recover from errors (NNG heuristic 9)
  • Lacked clear documentation, requiring a lot of hand-holding for new users (NNG heuristic 10)

Initial migration attempt

Screenshot of page with annotations on the side

Phase 1 research, evaluation, and redesign

As design joined later in the process, I had to collaborate heavily to ensure that feedback was implemented properly.

Phase 1 evaluation and iterations

Image of webpage with annotations on the side

Implemented feedback: used visual hierarchy to show the association between the database tool and the relevant criteria section

Image of webpage with annotations on the side

Reduced horizontal form navigation; however, we continued to struggle to get design updates fully into development

Before and after redesign

Screenshot of an app created in Retool showing a data table on top half of screen and form section on bottom half of screen

Engineer version

The initial iterations left important information outside the initial viewport and had a lot of visual clutter

A screenshot of a website page with a green background.

Redesign

I interviewed 3 raters and discussed findings with team members to clarify and simplify the interface.

The redesign, based on user feedback and design evaluation, improves visual hierarchy, reduces visual clutter, and increases users' efficiency of use.

Implemented improvements:

Help and documentation

  • Moved instructions into prominent area for new raters to view when needed

Visibility of system status

  • Kept action buttons fixed at the bottom; previously, users would miss the save button and lose changes

Aesthetic and minimalist design

  • Used navigation to display relevant bank information, eliminating the unnecessary table view while a user rates banks
  • Consolidated horizontal tabs and corresponding summary items into a vertical navigation style list of rating criteria

Help users recognize, diagnose, and recover from errors

  • Added error messages and alerts to inform users of ways to recover

Phase 1 redesign outcomes

  • Built foundation for moving process to a single platform for full bank rating cycle
  • Reduced rating time from 2.5 to 1 hour
  • Eliminated calculation and formula errors 
  • Saved experienced raters time by reducing their involvement in training new raters

Persisting issues after redesign

  • Raters still have to use 2 tools
  • Goal is <30 min, currently at 1 hour

Background image of a grid of flow charts

User flows and technical constraints inform error handling

I started my process by drawing out user flows and getting feedback from users and engineers.  

Anticipating sticky points

My engineering background helped drive conversations about asynchronous capability and error timing, helping me identify 2 main sticky points that informed the flows and designs.

01

Asynchronous actions 

What can be done asynchronously?
How do we inform users when a process is completed?

02

Complex error handling

When should we inform users about errors?
➜  Identified 3 possible points in flow to handle errors

Designs

Translating engineering and user feedback to designs

Asynchronous flows
I determined that storing and processing documents can be done asynchronously, and designed the flow so users can continue other actions in the app, such as updating rating criteria. Users can clearly see upload progress even when the upload modal is closed, and receive alerts when an upload process is terminated.
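
The behavior above can be sketched as a small upload tracker whose summary state drives a background progress indicator. This is a minimal sketch with hypothetical names, not the actual Retool implementation:

```typescript
// Minimal sketch of background upload tracking (all names hypothetical).
// Jobs keep progressing after the upload modal closes; summary() feeds
// the progress indicator shown elsewhere in the app.

type UploadStatus = "uploading" | "processing" | "done" | "failed";

interface UploadJob {
  id: string;
  fileName: string;
  status: UploadStatus;
  progress: number; // 0-100
}

class UploadTracker {
  private jobs = new Map<string, UploadJob>();

  start(id: string, fileName: string): void {
    this.jobs.set(id, { id, fileName, status: "uploading", progress: 0 });
  }

  update(id: string, status: UploadStatus, progress: number): void {
    const job = this.jobs.get(id);
    if (job) {
      job.status = status;
      job.progress = progress;
    }
  }

  // Summary for the background indicator: how many jobs are still
  // running, and whether any terminated with an error (to raise an alert).
  summary(): { active: number; failed: number } {
    let active = 0;
    let failed = 0;
    for (const job of this.jobs.values()) {
      if (job.status === "uploading" || job.status === "processing") active++;
      if (job.status === "failed") failed++;
    }
    return { active, failed };
  }
}
```

Keeping job state in one place, rather than in the modal, is what lets the indicator and alerts survive the modal being dismissed.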

Upload errors include invalid user input, unsupported file formats or sizes, and bad URL links. These errors are handled within the upload modal, so small errors won't cause the longer document processing and storage actions to fail at a later time.
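
A client-side check of this kind might look like the following sketch; the field names, allowed formats, and size limit are assumptions for illustration:

```typescript
// Hypothetical client-side upload validation (names and limits assumed).
// Catching these errors in the modal avoids starting a long processing
// job that would only fail minutes later.

interface UploadInput {
  fileName: string;
  fileSizeMb: number;
  sourceUrl?: string; // optional link to the original document
}

const ALLOWED_EXTENSIONS = ["pdf", "html", "txt"]; // assumed formats
const MAX_SIZE_MB = 50; // assumed limit

function validateUpload(input: UploadInput): string[] {
  const errors: string[] = [];

  const ext = input.fileName.split(".").pop()?.toLowerCase() ?? "";
  if (!ALLOWED_EXTENSIONS.includes(ext)) {
    errors.push(`Unsupported file format: .${ext}`);
  }
  if (input.fileSizeMb > MAX_SIZE_MB) {
    errors.push(`File exceeds ${MAX_SIZE_MB} MB limit`);
  }
  if (input.sourceUrl !== undefined) {
    try {
      new URL(input.sourceUrl); // URL constructor throws on malformed links
    } catch {
      errors.push("Source URL is not a valid link");
    }
  }
  return errors;
}
```

Returning a list of messages, rather than failing on the first problem, lets the modal show every issue at once so the rater can fix them in one pass.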

A screenshot of a document upload page with a red X in the lower right corner.

Document processing errors are backend-heavy and include errors like unreadable formats and failures to store to Azure. These errors might occur 5-10 minutes after a user attempts to upload a document.

To address these, I used a persistent alert along with an alerts page that tracks a history of failed uploads and offers actions to recover from errors after leaving and returning to the app.
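
The persistent-alert behavior could be modeled as deriving alert entries from stored job records rather than transient UI state, so failures survive leaving and returning to the app. A sketch, with assumed field names:

```typescript
// Sketch of an alerts page fed by persisted job records (names assumed).
// Because alerts are derived from stored jobs rather than in-memory UI
// state, a rater who leaves and returns still sees what failed and can retry.

type JobStatus = "processing" | "done" | "failed";

interface JobRecord {
  id: string;
  fileName: string;
  status: JobStatus;
  failureReason?: string; // e.g. "unreadable format", "storage error"
}

interface AlertEntry {
  jobId: string;
  message: string;
  canRetry: boolean;
}

function buildAlerts(jobs: JobRecord[]): AlertEntry[] {
  return jobs
    .filter((j) => j.status === "failed")
    .map((j) => ({
      jobId: j.id,
      message: `${j.fileName}: ${j.failureReason ?? "upload failed"}`,
      canRetry: true, // every failed upload offers a retry action
    }));
}
```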

Flow chart of webpages leading to another

Outcomes

01

Reduced rating time

Raters reported spending about 1 hour to fully rate a bank, down from 2.5 hours (the final goal being 20-30 minutes). The new upload flow saves an extra ~5-10 minutes.

02

Reduced from 3 tools to 1

Increased efficiency of use: the document upload feature eliminated the third-party tool used to track sources, eliminating confusion and reducing the time raters spent remembering different logins and tools

03

Positive user feedback

New raters had fewer questions when rating a bank for the first time. Users mentioned the new interface was easier to use.

Next steps

  • The next phase will use AI to parse uploaded documents and pre-fill criteria to speed up bank rating

Key learnings

01

New design constraints

Designing for Retool was a new experience. I learned to ask about Retool's limitations, which informed my decisions for flows such as auto-save.

02

Unique challenges

Something I enjoy about designing for Bank Green is the opportunity to be creative when solving problems, since there often isn’t direct precedent for me to get inspiration from.  I enjoyed learning about the unique role of the rater and being able to improve their daily workflow.

03

Engineer-driven process

Being part of a product and engineering team that's led by an engineer means it's critical for me as a designer to stay proactive in asking questions, planning ahead, and communicating with the team; otherwise, users and I may find unexpected, confusing changes popping up!