ForRent.com
ForRent.com is a leading website for apartment hunters. The company aims to streamline the apartment leasing process for prospective tenants while providing high-quality leads for property managers.
While at ForRent.com I was tasked with making continual improvements to the user experience, drawing on a wide variety of tools. Among other methods, I frequently ran A/B tests, built rapid prototypes, and conducted both moderated and unmoderated user testing sessions.
Overview
User Testing
Prompt Highlights
When I joined the team, I was tasked with gathering as much insight into our users from as many different angles as possible.
One of the very first studies I conducted was a series of remote user tests. The goal was not just to find the obvious pain points in the conversion funnel, but the less obvious hurdles as well. While the “whys” aren’t always immediately evident, issues found while testing serve as great starting points for future experimentation and testing.
Search for a 1 bedroom apartment in Chicago, IL for less than $1,900.
The biggest challenge of an unmoderated user test is writing prompts that are detailed enough to prevent roadblocks, yet vague enough to encourage the free exploration a user would engage in naturally. If a prompt becomes too prescriptive, the test loses its realism.
Explore the website as you naturally would during an apartment hunt. Spend no more than 5 minutes on this task.
This prompt is another attempt to inject realism and open-ended exploration into the session. It’s also a way to surface insights that might otherwise be glossed over.
If this were not a study, would you have quit your apartment hunt at any point? Explain your answer.
This prompt provides an easy, stress-free way to open a conversation about the negative aspects of the experience. Although testers are repeatedly told we want unbiased, unfiltered feedback, many people still lean toward positive comments.
Refine your search by filtering for apartments that have a “fitness center” and “hardwood floors”.
We had some data points that suggested a lack of interaction with certain filter groups, so I wanted to narrow down possible causes.
Example Result - Video Transcription / Notable Points
This is a detailed result for one specific user. Notable portions from the video and applicable notes include:
1:25 - Immediately looks to filter results
3:30 - Easily navigates through photo galleries and faceted search
5:58 - Seems to expect “Check Availability” to be something other than a guest card
6:15 - Understands users can “like” a property to save it to their account
6:30 - 7:30 - Looks for a way to email the property; again, probably doesn’t realize the functionality of “Check Availability”
7:45 - Loves that schools and maps are listed
9:40 - Raises questions about the completeness or validity of reviews. “Does lack of reviews mean zero stars?”
10:13 - Says the site is “clean”, but is annoyed by advertisements
Aggregated Results
I ran a total of 5 users through this test, since this is widely considered the sweet spot for the maximum benefit-cost ratio (https://www.nngroup.com/articles/how-many-test-users/). Each test took around 10 minutes to complete.
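The “5 users” rule of thumb comes from the problem-discovery model described in the NN/g article linked above: the share of usability problems found grows as 1 - (1 - L)^n, where L is the proportion of problems a single user uncovers (roughly 31% in the article’s data). A minimal sketch of that model, using the article’s L value:

```python
def share_of_problems_found(n_users, L=0.31):
    """Expected fraction of usability problems found after n_users tests.

    Model from the NN/g article: 1 - (1 - L)**n, where L is the share of
    problems a single user uncovers (~31% in the article's data).
    """
    return 1 - (1 - L) ** n_users

# Diminishing returns: each additional user uncovers fewer new problems.
for n in (1, 3, 5, 15):
    print(f"{n:>2} users -> {share_of_problems_found(n):.0%} of problems found")
```

With L = 0.31, five users already surface roughly 85% of the problems, which is why additional participants yield sharply diminishing returns.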
Highlights from other participants include:
Gets hung up on custom pricing input.
Notices the “Preferred” tag, but thinks these are highly rated apartments.
Wants full-screen images, not “tiny baby pictures”.
Notices that a property is outside of his set price range.
Unexpected behavior from “Special Cable Offers”. It seems he didn’t realize this was an external ad.
Scrolls past the custom price fields.
Believes search results auto-sort based on price.
Thinks the “Preferred” banner is used to show properties that match her filters.
Meaning of different colored properties on “Map View” is not clear.
User Studies - Takeaways
Transparency - There's nothing wrong with allowing advertisers to promote listings, but it needs to be clear to the user when this occurs.
“Preferred” is a confusing term to describe paid listings. This is even more baffling on the map when there are different colors with no provided legend.
Sorting - There is no visible default sort and no easy way to change the sort order. When advertising makes it impossible to respect every filter, it’s better to be upfront about it, as Hotels.com is; that way it’s clear when results don’t fully match the user’s input.
Call-To-Action Clarity - As shown in numerous tests, the term “Check Availability” carries a connotation of live insight into vacancies, when in reality ForRent sends an email on the user’s behalf to property management. This mislabeling became the subject of many future A/B tests, since it was such a pivotal point in the conversion funnel.
AB Testing & Implemented Changes
This A/B test made a simple text change from “Check Availability” to “Email Property”. It resulted in fewer initial clicks, but a higher rate of successful conversions further down the funnel, suggesting the experimental term more closely aligned with user expectations. Ultimately, this wording was adopted and is still in use years later.
Version A - Control
Version B
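The case study doesn’t report the underlying click and conversion counts, so the figures below are purely hypothetical. This sketch shows how an A/B result like this one (fewer clicks, more completed leads) would typically be checked for statistical significance with a standard two-proportion z-test:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 1 - erf(abs(z) / sqrt(2))
    return z, p_value

# Hypothetical traffic, 10,000 visitors per arm:
# A ("Check Availability") converts 4.0%; B ("Email Property") converts 4.7%.
z, p = two_proportion_z(conv_a=400, n_a=10_000, conv_b=470, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

At these made-up volumes the 0.7-point lift clears the conventional p < 0.05 bar; with less traffic the same lift would not, which is why tests like this run until sample sizes are adequate.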