3 Best Mobile Apps to Earn Some Extra Income

Original Source: https://www.hongkiat.com/blog/cash-rewarding-mobile-apps/

Most apps require you to spend money, but did you know there are apps that could help you earn money instead? That’s right, by doing some simple tasks like completing surveys and offers or even just…

Visit hongkiat.com for full content.

Powerful Image Analysis With Google Cloud Vision And Python

Original Source: https://www.smashingmagazine.com/2019/01/powerful-image-analysis-google-cloud-vision-python/

Bartosz Biskupski

2019-01-09

Quite recently, I built a web app to manage users’ personal expenses. Its main features are scanning shopping receipts and extracting data for further processing. The Google Vision API turned out to be a great tool for extracting text from a photo. In this article, I will guide you through the development process with Python in a sample project.

If you’re a novice, don’t worry. You will only need a very basic knowledge of this programming language — with no other skills required.

Let’s get started, shall we?

Never Heard Of Google Cloud Vision?

It’s an API that allows developers to analyze the content of an image through extracted data. For this purpose, Google utilizes machine learning models trained on a large dataset of images. All of that is available with a single API request. The engine behind the API classifies images, detects objects and people’s faces, and recognizes printed words within images.

To give you an example, let’s bring up the well-liked Giphy. They’ve adopted the API to extract caption data from GIFs, which has resulted in a significant improvement in user experience. Another example is realtor.com, which uses the Vision API’s OCR to extract text from images of “For Sale” signs taken in its mobile app to provide more details on the property.

Machine Learning At A Glance

Let’s start by answering the question many of you have probably heard before: what is machine learning?

The broad idea is to develop a programmable model that finds patterns in the data it’s given. The higher the quality of the data you deliver and the better the design of the model you use, the smarter the outcome will be. With ‘friendly machine learning’ (as Google calls its machine learning API services), you can easily incorporate a chunk of artificial intelligence into your applications.

Recommended reading: Getting Started With Machine Learning


How To Get Started With Google Cloud

Let’s start with registering for Google Cloud. Google requires authentication, but it’s simple and painless: you’ll only need to store a JSON file that includes your API key, which you can get directly from the Google Cloud Platform.

Download the file and add its path to your environment variables:

export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/apikey.json

Alternatively, in development, you can support yourself with the from_service_account_json() method, which I’ll describe further in this article. To learn more about authentication, check out Cloud’s official documentation.

Google provides a Python package to deal with the API. Let’s add the latest version of google-cloud-vision==0.33 to your app.
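If you manage dependencies with pip, installing that pinned version is one command (assuming a virtualenv or similar environment):

pip install google-cloud-vision==0.33

Time to code!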

How To Combine Google Cloud Vision With Python

Firstly, let’s import classes from the library.

from google.cloud import vision
from google.cloud.vision import types

With that taken care of, you’ll need an instance of a client, which you’ll then use with the text recognition feature.

client = vision.ImageAnnotatorClient()

If you don’t store your credentials in environment variables, at this stage you can add them directly to the client.

client = vision.ImageAnnotatorClient.from_service_account_file(
    '/path/to/apikey.json'
)

Assuming that you store the images to be processed in a folder called ‘images’ inside your project directory, let’s open one of them.

An example of a simple receipt that could be processed by Google Cloud Vision.

image_to_open = 'images/receipt.jpg'

with open(image_to_open, 'rb') as image_file:
    content = image_file.read()

The next step is to create a Vision object, which will allow you to send a request to proceed with text recognition.

image = vision.types.Image(content=content)

text_response = client.text_detection(image=image)

The response consists of detected words stored as description keys, their location on the image, and a language prediction. For example, let’s take a closer look at the first word:

[
description: "SHOPPING"
bounding_poly {
  vertices {
    x: 1327
    y: 1513
  }
  vertices {
    x: 1789
    y: 1345
  }
  vertices {
    x: 1821
    y: 1432
  }
  vertices {
    x: 1359
    y: 1600
  }
}
]
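Besides the description, each annotation carries its location. Here is a minimal sketch, assuming the text_response from above, that reads the corner coordinates of one detected word:

# text_annotations[0] holds the whole detected text block,
# so individual words start at index 1.
word = text_response.text_annotations[1]

# Collect the four corners of the word as (x, y) tuples.
corners = [(vertex.x, vertex.y) for vertex in word.bounding_poly.vertices]
print(word.description, corners)

This is useful when you need to know where on the receipt a given word sits, for example to group prices with the line items they belong to.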

As you can see, to filter out the text only, you need to read the description field of each element. Luckily, help comes from Python’s powerful list comprehension.

texts = [text.description for text in text_response.text_annotations]

['SHOPPING STORE\nREG 12-21\n03:22 PM\nCLERK 2\n618\n1 MISC\n1 STUFF\n$0.49\n$7.99\n$8.48\n$0.74\nSUBTOTAL\nTAX\nTOTAL\nCASH\n6\n$9. 22\n$10.00\nCHANGE\n$0.78\nNO REFUNDS\nNO EXCHANGES\nNO RETURNS\n', 'SHOPPING', 'STORE', 'REG', '12-21', '03:22', 'PM', 'CLERK', '2', '618', '1', 'MISC', '1', 'STUFF', '$0.49', '$7.99', '$8.48', '$0.74', 'SUBTOTAL', 'TAX', 'TOTAL', 'CASH', '6', '$9.', '22', '$10.00', 'CHANGE', '$0.78', 'NO', 'REFUNDS', 'NO', 'EXCHANGES', 'NO', 'RETURNS']

If you look carefully, you’ll notice that the first element of the list contains all the text detected in the image stored as a single string, while the others are the separate words. Let’s print it out.

print(texts[0])

SHOPPING STORE
REG 12-21
03:22 PM
CLERK 2
618
1 MISC
1 STUFF
$0.49
$7.99
$8.48
$0.74
SUBTOTAL
TAX
TOTAL
CASH
6
$9. 22
$10.00
CHANGE
$0.78
NO REFUNDS
NO EXCHANGES
NO RETURNS

Pretty accurate, right? And obviously quite useful, so let’s play more.
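Since the sample app is about tracking expenses, a natural next step is to pull the monetary amounts out of the recognized words. Here is a minimal sketch using Python’s standard re module, assuming the texts list built above (the $X.XX pattern is an assumption based on this particular receipt):

import re

# Keep only the separate words that look like dollar amounts.
amounts = [t for t in texts[1:] if re.fullmatch(r'\$\d+\.\d{2}', t)]
print(amounts)
# ['$0.49', '$7.99', '$8.48', '$0.74', '$10.00', '$0.78']

Note that OCR noise such as the ‘$9. 22’ total, which came back split into ‘$9.’ and ‘22’, slips through a strict pattern like this, so real receipts need some extra cleanup logic.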

What Can You Get From Google Cloud Vision?

As I’ve mentioned above, Google Cloud Vision isn’t only about recognizing text; it also lets you discover faces, landmarks, image properties, and web connections. With that in mind, let’s find out what it can tell you about the web associations of the image.

web_response = client.web_detection(image=image)

Okay Google, do you actually know what is shown in the image you received?

web_content = web_response.web_detection
web_content.best_guess_labels
>>> [label: "Receipt"]

Good job, Google! It’s a receipt indeed. But let’s give you a bit more exercise: can you see anything else? How about more predictions expressed as percentages?

predictions = [
    (entity.description, '{:.2%}'.format(entity.score)) for entity in web_content.web_entities
]

>>> [('Receipt', '70.26%'), ('Product design', '64.24%'), ('Money', '56.54%'), ('Shopping', '55.86%'), ('Design', '54.62%'), ('Brand', '54.01%'), ('Font', '53.20%'), ('Product', '51.55%'), ('Image', '38.82%')]

Lots of valuable insights, well done, my almighty friend! Can you also find out where the image comes from and whether it has any copies?

web_content.full_matching_images
>>> [
url: "http://www.rcapitalassociates.com/wp-content/uploads/2018/03/receipts.jpg",
url: "https://media.istockphoto.com/photos/shopping-receipt-picture-id901964616?k=6&m=901964616&s=612x612&w=0&h=RmFpYy9uDazil1H9aXkkrAOlCb0lQ-bHaFpdpl76o9A=",
url: "https://www.pakstat.com.au/site/assets/files/1172/shutterstock_573065707.500x500.jpg"
]

I’m impressed. Thanks, Google! But one is not enough; can you please give me three examples of similar images?

web_content.visually_similar_images[:3]
>>> [
url: "https://thumbs.dreamstime.com/z/shopping-receipt-paper-sales-isolated-white-background-85651861.jpg",
url: "https://thumbs.dreamstime.com/b/grocery-receipt-23403878.jpg",
url: "https://image.shutterstock.com/image-photo/closeup-grocery-shopping-receipt-260nw-95237158.jpg"
]

Sweet! Well done.

Is There Really An Artificial Intelligence In Google Cloud Vision?

As you can see in the image below, dealing with receipts can get a bit emotional.

An example of the stress you can experience while getting a receipt.

Let’s have a look at what the Vision API can tell you about this photo.

image_to_open = 'images/face.jpg'

with open(image_to_open, 'rb') as image_file:
    content = image_file.read()
image = vision.types.Image(content=content)

face_response = client.face_detection(image=image)
face_content = face_response.face_annotations

face_content[0].detection_confidence
>>> 0.5153166651725769

Not too bad: the algorithm is more than 50% sure that there is a face in the picture. But can you learn anything about the emotions behind it?

face_content[0]
>>> [

joy_likelihood: VERY_UNLIKELY
sorrow_likelihood: VERY_UNLIKELY
anger_likelihood: UNLIKELY
surprise_likelihood: POSSIBLE
under_exposed_likelihood: VERY_UNLIKELY
blurred_likelihood: VERY_UNLIKELY
headwear_likelihood: VERY_UNLIKELY

]

Surprisingly, with a simple command, you can check the likelihood of some basic emotions as well as headwear or photo properties.
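One practical detail: in plain Python, those likelihood fields come back as enum integers rather than the names printed above. Here is a small sketch that maps them to readable labels, assuming the face_content from above (the ordering follows the Vision API Likelihood enum, from UNKNOWN at 0 to VERY_LIKELY at 5):

# Index matches the Vision API Likelihood enum values 0..5.
LIKELIHOOD_NAMES = [
    'UNKNOWN', 'VERY_UNLIKELY', 'UNLIKELY',
    'POSSIBLE', 'LIKELY', 'VERY_LIKELY',
]

face = face_content[0]
print('surprise:', LIKELIHOOD_NAMES[face.surprise_likelihood])
print('joy:', LIKELIHOOD_NAMES[face.joy_likelihood])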

When it comes to the detection of faces, I need to direct your attention to some of the potential issues you may encounter. You need to remember that you’re handing a photo over to a machine, and although Google’s API utilizes models trained on huge datasets, it’s possible that it will return some unexpected and misleading results. Online, you can find photos showing how easily artificial intelligence can be tricked when it comes to image analysis. Some of them may seem funny, but there is a fine line between innocent and offensive mistakes, especially when a mistake concerns a human face.

Without a doubt, Google Cloud Vision is a robust tool. Moreover, it’s fun to work with. The API’s REST architecture and the widely available Python package make it even more accessible to everyone, regardless of how advanced you are in Python development. Just imagine how significantly you can improve your app by utilizing its capabilities!

Recommended reading: Applications Of Machine Learning For Designers

How Can You Broaden Your Knowledge On Google Cloud Vision?

The scope of possibilities for applying the Google Cloud Vision service is practically endless. With the Python library available, you can utilize it in any project based on the language, whether it’s a web application or a scientific project. It can certainly help you develop a deeper interest in machine learning technologies.

Google’s documentation provides some great ideas on how to apply the Vision API features in practice, as well as the possibility to learn more about machine learning. I especially recommend checking out the guide on how to build an advanced image search app.

One could say that what you’ve seen in this article is like magic. After all, who would’ve thought that a simple and easily accessible API is backed by such a powerful, scientific tool? All that’s left to do is write a few lines of code, unwind your imagination, and experience the boundless potential of image analysis.


Popular Design News of the Week: December 31, 2018 – January 6, 2019

Original Source: https://www.webdesignerdepot.com/2019/01/popular-design-news-of-the-week-december-31-2018-january-6-2019/

Every week users submit a lot of interesting stuff on our sister site Webdesigner News, highlighting great content from around the web that can be of interest to web designers. 

The best way to keep track of all the great stories and news being posted is simply to check out the Webdesigner News site. However, in case you missed some, here’s a quick and useful compilation of the most popular designer news we curated from the past week.

Note that this is only a very small selection of the links that were posted, so don’t miss out and subscribe to our newsletter and follow the site daily for all the news.

8 Undoubtably True Predictions for UX in 2019

Design Style Guides to Learn from in 2019

A Collection of Great UI Designs

Site Design: Coding is Fun!

8 Examples of How to Effectively Break Out of the Grid

The 15 Coolest Interfaces of the Year

4 Useless Things You Shouldn’t Have Put in your Design Portfolio

Meet Twill: An Open Source CMS Toolkit for Laravel

The Grumpy Designer’s Bold Predictions for 2019

This is not User Experience

Branding Design – What You Need to Know Before Creating a Brand Identity

The Year that Was: 2018 in Web Design

Flat Design Vs. Traditional Design: Comparative Experimental Study

Users Don’t Read

Writing Copy for Landing Pages

The Elements of UI Engineering

Motion Design Looks Hard, but it Doesn’t Have to Be

Merge by UXPin

Responsive Design, and the Role of Development in Design

Material Design Colors Listed

Designing a Great User Onboarding Experience

How to Name UI Components

Is Design Valuable?

40+ Best Bootstrap Admin Templates of 2019

UI Design: Look Back at 12 Top Interface Design Trends in 2018

Want more? No problem! Keep track of top design news from around the web with Webdesigner News.


How To Design Search For Your Mobile App

Original Source: https://www.smashingmagazine.com/2019/01/design-search-mobile-app/

Suzanne Scacca

2019-01-08

Why is Google the search behemoth it is today? Part of the reason is how it’s transformed our ability to search for answers.

Think about something as simple as looking up the definition of a word. Twenty years ago, you would’ve had to pull your dictionary off the shelf to find an answer to your query. Now, you open your phone or turn on your computer, type or speak the word, and get an answer in no time at all, with little effort on your part.

This form of digital shortcutting doesn’t just exist on search engines like Google. Mobile apps now have self-contained search functions as well.

Is a search bar even necessary in a mobile app interface or is it overkill? Let’s take a look at why the search bar element is important for the mobile app experience. Then, we’ll look at a number of ways to design search based on the context of the query and the function of the app.


Mobile App Search Is Non-Negotiable

The search bar has been a standard part of websites for years, but statistics show that it isn’t always viewed as a necessity by users. This data from Neil Patel and Kissmetrics focuses on the perception and usage of the search bar on e-commerce websites:

Data from a Kissmetrics infographic about site search. (Source: Kissmetrics)

As you can see, 60% of surveyed users prefer using navigation instead of search while 47% opt for filterable “search” over regular search functionality.

On a desktop website, this makes sense. When a menu is well-designed and well-labeled — no matter how extensive it may be — it’s quite easy to use. Add to that advanced filtering options, and I can see why website visitors would prefer that to search.

But mobile app users are a different breed. They go to mobile apps for different reasons than they do websites. In sum, they want a faster, concentrated, and more convenient experience. However, since smartphone screens have limited space, it’s not really feasible to include an expansive menu or set of filters to aid in the navigation of an app.

This is why mobile apps need a search bar.

You’re going to find a lot of use for search in mobile apps:

Content-driven apps like newspapers, publishing platforms, and blogs;
e-Commerce shops with large inventories and categorization of those inventories;
Productivity apps that contain documents, calendars, and other searchable records;
Listing sites that connect users to the right hotel, restaurant, itinerary, item for sale, apartment for rent, and so on;
Dating and networking apps that connect users with vast quantities of “matches”.

There are plenty more reasons why you’d need to use a search bar on your mobile app, but I’m going to let the examples below speak for themselves.

Ways To Design Search For Your Mobile App

I’m going to break down this next section into two categories:

How to design the physical search element in your mobile app,
How to design the search bar and its results within the context of the app.

1. Designing The Physical Search Element

There are a number of points to consider when it comes to the physical presence of your app search element:

Top Or Bottom?

Shashank Sahay explains why there are two places where the search element appears on a mobile app:

1. Full-width bar at the top of the app.
This is for apps that are driven by search. Most of the time, users open the app with the express purpose of conducting a search.

Facebook prioritizes app search by placing it at the top. (Source: Facebook)

Facebook is a good example. Although Facebook users most likely do engage with the news feed in the app, I have a sneaking suspicion that Facebook’s data indicates that the search function is more commonly engaged with, at least in terms of first steps. Hence its placement at the top of the app.

2. A tab in the bottom-aligned navigation bar.
This is for apps that utilize search as an enhancement to the primary experience of using the app’s main features.

Let’s contrast Facebook against one of its sister properties: Instagram. Unlike Facebook, Instagram is a very simple social media app. Users follow other accounts and get glimpses into the content they share through full-screen story updates as well as from inside their endless-scroll news feed.

Instagram places its search function in the bottom navigation bar. (Source: Instagram)

With that said, the search function does exist in the navigation bar so that users can look up other accounts to peruse through or follow.

As far as this basic breakdown goes, Sahay is right about how placement of search correlates with intention. But the designing of the search element goes beyond just where it’s placed on the app.

Shallow Or Deep?

There will be times when a mobile app would benefit from a search function deep within the app experience.

You’ll see this sort of thing quite often in e-commerce apps like Bed Bath & Beyond:

Bed Bath & Beyond uses deep search to help users find nearby stores. (Source: Bed Bath & Beyond)

In this example, this search function exists outside of the standard product search on the main landing page. Results for this kind of search are also displayed in a unique way which is reflective of the purpose of the search:

Bed Bath & Beyond displays search results on a map. (Source: Bed Bath & Beyond)

There are other ways you might need to use “deep” search functions in e-commerce apps.

Think about stores that have loads of comments attached to each product. If your users want to zero in on what other consumers had to say about a product (for example, if a camping tent is waterproof), the search function would help them quickly get to reviews containing specific keywords.

You’ll also see deep searches planted within travel and entertainment apps like Hotels.com:

Hotels.com includes a deep search to narrow down results by property name. (Source: Hotels.com)

You’re all probably familiar with the basic search function that goes with any travel-related app. You enter the details of your trip and it pulls up the most relevant results in a list or map format. That’s what this screenshot is of.

However, see where it says “Property Name” next to the magnifying glass? This is a search function within a search function. And the only things users can search for here are actual hotel property names.

Bar, Tab, Or Magnifying Glass?

This brings me to my next design point: how to know which design element to represent the search function with.

You’ve already seen clear reasons to use a full search bar over placing a tab in the navigation bar. But how about a miniaturized magnifying glass?

Here’s an example of how this is used in the YouTube mobile app:

YouTube uses a magnifying glass to represent its search function. (Source: YouTube)

The way I see it, the magnifying glass is the search design element you’d use when:

One of the primary reasons users come to the app is to do a search,
And it competes against another primary use case.

In this case, YouTube needs the mini-magnifying glass because it serves two types of users:

Users that come to the app to search for videos.
Users that come to the app to upload their own videos.

To conserve space, links to both exist within the header of the YouTube app. If you have competing priorities within your app, consider doing the same.

“Search” Or Give A Hint?

One other thing to think about when designing search for mobile apps is the text inside the search box. To decide this, you have to ask yourself:

“Will my users know what sort of stuff they can look up with this search function?”

In most cases they will, but it might be best to include hint text inside the search bar just to make sure you’re not adding unnecessary friction. Here’s what I mean by that:

This is the app for Airbnb:

Airbnb offers hint text to guide users to more accurate search results. (Source: Airbnb)

The search bar tells me to “Try ‘Costa de Valencia’”. It’s not necessarily an explicit suggestion. It’s more about helping me figure out how I can use this search bar to research places to stay on an upcoming trip.

For users that are new to Airbnb, this would be a helpful tip. They might come to the site thinking it’s like Hotels.com, which enables users to look up things like flights and car rentals. Airbnb, instead, is all about providing lodging and experiences, so this search text is a good way to guide users in the right direction and keep them from receiving a “Sorry, there are no results that match your query” response.

2. Designing The Search Bar And Results In Context

Figuring out where to place the search element is one point to consider. Now, you have to think about how to present the results to your mobile app users:

Simple Search

This is the most basic of the search functions you can offer. Users type their query into the search bar. Relevant results appear below. In other words, you leave it up to your users to know what they’re searching for and to enter it correctly.

When a relevant query is entered, you can provide results in a number of ways.

For an app like Flipboard, results are displayed as trending hashtags:

Flipboard displays search results as a list of hashtags. (Source: Flipboard)

It’s not the most common way you’d see search results displayed, but it makes sense in this particular context. What users are searching for are categories of content they want to see in their feed. These hashtagged categories allow users to choose high-level topics that are the most relevant to them.

ESPN has a more traditional basic search function:

ESPN has designed its search results in a traditional list. (Source: ESPN)

As you can see, ESPN provides a list of results that contain the keyword. There’s nothing more to it than that though. As you’ll see in the following examples, you can program your app search to more closely guide users to the results they want to see.

Filtered Search

According to the aforementioned Kissmetrics survey, advanced filtering is a popular search method among website users. If your mobile app has a lot of content or a vast inventory of products, consider adding filters to the end of your search function to improve the experience further. Your users are already familiar with the search technique. Plus, it’ll save you the trouble of having to add advancements to the search functionality itself.

Yelp has a nice example of this:

Yelp users have filter options available after doing a search. (Source: Yelp)

In the search above, I originally looked for restaurants in my “Current Location”. Among the various filters displayed, I decided to add “Order Delivery” to my query. My search query then became:

Restaurants > Current Location > Delivery

This is really no different than using breadcrumbs on a website. In this case, you let users do the initial work by entering a search query. Then, you give them filters that allow them to narrow down their search further.

Again, this is another way to reduce the chances that users will encounter the “No results” response to their query. Because filters correlate to actual categories and segmentations that exist within the app, you can ensure they end up with valid search results every time.

e-Commerce websites are another good use case for filters. Here is how Wayfair does this:

Wayfair includes filters in search to help users narrow down results. (Source: Wayfair)

Wayfair’s list of search results is fairly standard for an e-commerce marketplace. The number of items is displayed, followed by a grid of matching product images and summary details.

Here’s the thing though: Wayfair has a massive inventory. It’s the same with other online marketplaces like Amazon and Zappos. So, when you tell users that their search query produced 2,975 items, you need a way to mitigate some of the overwhelm that may come with that.

By placing the Sort and Filter buttons directly beside the search result total, you’re encouraging users to do a little more work on their search query to ensure they get the best and most relevant results.

Predictive Search

Autocomplete is something your users are already familiar with. For apps that contain lots of content, utilizing this type of search functionality could be majorly helpful to your users.

For one, they already know how it works and so they won’t be surprised when related query suggestions appear before them. In addition, autocomplete offers a sort of personalization. As you gather more data on a user as well as the kinds of searches they conduct, autocomplete anticipates their needs and provides a shortcut to the desired content.

Pinterest is a social media app that people use to aggregate content they’re interested in and to seek out inspiration for pretty much anything they’re doing in life:

Pinterest anticipates users’ search queries and provides autocomplete shortcuts. (Source: Pinterest)

Take a look at the search results above. Can you tell what I’ve been thinking about lately? The first is how I’m going to decorate my new apartment. The second is my next tattoo. And despite only typing out the word “Small”, Pinterest immediately knew what’s been top-of-mind with me as of recent. That doesn’t necessarily mean I as a user came to the app with that specific intention today… but it’s nice to see that personalized touch as I engage with the search bar.

Another app I engage with a lot is the Apple Photos app:

Apple Photos uses autocomplete to help users find the most relevant photos. (Source: Apple)

In addition to using it to store all of my personal photos, I use this on a regular basis to take screenshots for work (as I did in this article). As you can imagine, I have a lot of content saved to this app and it can be difficult finding what I need just by scrolling through my folders.

In the example above, I was trying to find a photo I had taken at Niagara Falls, but I couldn’t remember if I had labeled it as such. So, I typed in “water” and received some helpful autocomplete suggestions on “water”-related words as well as photos that fit the description.

I would also put “Recent Search” results into this bucket. Here’s an example from Uber:

Uber’s recent search results provide one-click shortcuts to repeat users. (Source: Uber)

Before I even had a chance to type my search query in the Uber app, it displays my most recent search queries for me.

I think this would be especially useful for people who use ride-sharing services on a regular basis. Think about professionals who work in a city. Rather than own a car, they use Uber to transport to and from their office as well as client appointments. By providing a shortcut to recent trips in search results, the Uber app cuts down the time they spend booking a trip.

If you have enough data on your users and you have a way to anticipate their needs, autocomplete is a fantastic way to personalize search and improve the overall experience.

Limited Search

I think this time savings point is an important one to remember when designing search for mobile apps.

Unlike websites where longer times-on-page matter, that’s not always the case with mobile apps. Unless you’ve built a gaming or news app where users should spend lots of time engaging with the app on a daily basis, it’s not usually the amount of time spent inside the app that matters.

Your goal in building a mobile app is to retain users over longer periods, which means providing a meaningful experience while they’re inside it. A well-thought-out search function will greatly contribute to this as it gets users immediately to what they want to see, even if it means they leave the app just a few seconds later.

If you have an app that needs to get users in and out of it quickly, think about limiting search results as Ibotta has done:

Ibotta displays categories that users can search in. (Source: Ibotta)

While users certainly can enter any query they’d like, Ibotta makes it clear that the categories below are the only ones available to search from. This serves as both a reminder of what the app is capable of as well as a means for circumventing the search results that don’t matter to users.

Hotels.com also places limits on its search function:

Hotels.com forces users to make a choice so they don’t end up with too many results. (Source: Hotels.com)

As you can see here, users can’t just look for hotels throughout the country of Croatia. It’s just too broad of a search and one that Hotels.com shouldn’t have to provide. For one, it’s probably too taxing on the Hotels.com server to execute a query of that nature. Plus, it would provide a terrible experience for users. Imagine how many hotels would show up in that list of results.

By reining in what your users can search for and the results they can see, you can improve the overall experience while shortening the time it takes them to convert.

Wrapping Up

As you can see here, a search bar isn’t some throwaway design element. When your app promises a speedy and convenient experience to its users, a search bar can cut down on the time they have to spend inside it. It can also make the app a more valuable resource as it doesn’t require much work or effort to get to the desired content.


10 Things to Quit Doing in 2019

Original Source: https://www.hongkiat.com/blog/things-to-quit-doing-2019/

Read the word “quit” and I’m almost certain you’re already imagining me telling you to leave it all behind, to start anew, to forget the past and other similar-sounding advice…

Visit hongkiat.com for full content.

Incredible Reinterpretations of Picasso in 3D using Cinema 4D and Octane

Original Source: http://feedproxy.google.com/~r/abduzeedo/~3/hrPG1jsyW4s/incredible-reinterpretations-picasso-3d-using-cinema-4d-and-octane

abduzeedo, Jan 08, 2019

Construed MIMIC III is the third installment in a series of studies of Picasso’s artwork and its translation to 3D form using Maxon Cinema 4D and Otoy Octane. The project was created by Omar Aqil, and it’s been a great experience for him, as he mentioned on his Behance post. “I have learned a lot from his work,” he adds. This time, Omar picked six art pieces from Picasso’s work and tried to recreate them in a different way. “I am trying to explore more complexity and abstraction of the shapes he used,” he explains. The result is a set of beautiful 3D artworks that bring a new dimension to the amazing work of Pablo Picasso.

Omar is an art director, CGI artist, and illustrator currently working at CR Studio. He is based in Lahore, Pakistan, and his portfolio includes much more amazing 3D work, including the previous two series of artworks in the MIMIC series: the Atypical Portraits, MIMIC II, and the original MIMIC. There are other incredible 3D projects in character design and typography, but the highlight for me is the abstract pieces. The Cubist Compositions series is also awesome to check out. As you can see, we highly recommend visiting Omar’s portfolio; it’s truly inspiring.

Interpreting Picasso in 3D


Respectful UX: 5 Ways to Make Users Feel Valued

Original Source: https://www.webdesignerdepot.com/2019/01/respectful-ux-5-ways-to-make-users-feel-valued/

Every human being needs, rather desperately, to feel accepted. They don’t necessarily need to feel accepted by everyone, but they do need to feel accepted by someone. It’s a part of our nature, and nature in general. Even as I type this, my cat Cleocatra is demanding that I give her attention, and if I don’t, she’ll leave my room in a huff.

[A few minutes later:] She didn’t leave. She stuck her claws into me. Users can have very similar reactions if their needs aren’t met when interacting with your website. Humans will anthropomorphize anything, and if they feel like your website doesn’t accept them for who they are, they might just go find one that will. Or do the claw thing.

This is why users need to feel accepted and respected when they use your site. I’ve put together some ideas about how to achieve that. Don’t expect a list of fifty genders to put in your forms, I’m not the guy to ask about that; this is just usability 101:

1. Respect Their Time

Build your website or product to be as efficient as possible. When users leave because your website didn’t load in five seconds or less, it’s not because they’re entitled. It’s because they have stuff to do, and a finite amount of hours in the day to do it.

If your site asks them to jump through a bunch of hoops before they can even read your content, or get that free e-book, or what-have-you, you’re not respecting their time. If your sign-up form is too detailed and asks for too much information, you’re wasting their time.

Think of all the meetings you’ve had that could have been emails. If your site feels more like going to a meeting, they won’t want to go.

2. Avoid Assumptions & Judgment in all Content

People often hate it when you make assumptions about them. They hate it when those assumptions are wrong, and they especially hate it when the assumptions are right. What will tick anyone right off is when those assumptions come with judgment.

Goddammit Netflix. Yes, I’m still watching.

Marketing in general has operated on assumptions and judgment for years, promoting now-derided ideas about what it means to be male, female, a good parent, or a good person. If you haven’t noticed, these assumptions have been the subject of considerable debate for years, now, and many are even considered harmful.

When you make too many assumptions about who is, or should, be using your website, you risk running off users and customers. Now, why would you do that?

3. Encourage Trying Again

People make mistakes, and lots of them. It’s normal, it’s life, and it’s why we have form validation. Now, if your users are signing up for or buying something they feel they absolutely need, and can’t get anywhere else, they’ll stay. They’ll keep trying to fill out your form or follow your app’s arcane process for doing things, no matter what.

I know I’ve personally abandoned checkout processes because, in the end, I felt like the thing I kinda sorta wanted to buy wasn’t worth the effort to try to buy it twice. I figured that if they wanted my money so badly they wouldn’t have made it so hard for me to spend, and went on my merry way.

Having to re-fill entire forms is a particular pet peeve of mine. If there’s a mistake, let me go back and fix the one mistake I made. Do not make me refill entire sections of the form. That just makes me wish I hadn’t bothered in the first place. Encourage people to try again when a process fails by not making them redo more work.

You can also encourage them to try again by never making them feel dumb for screwing up in the first place. Error messages should be clear, gentle, and encouraging. Making users feel dumb is the fastest way to put them on the defensive, and defensive people don’t buy stuff. Make it clear that mistakes are just part of the process sometimes, and they are more likely to feel accepted.

4. Provide Clear Instructions

People are more likely to feel like they belong when they know what they’re “supposed” to do. Learning the rules of any peer group is the first step to feeling accepted in a new place.

Clear instructions can provide this confidence to users (and help avoid situations where they feel dumb). You can do this in your copy, in your micro-copy, or even with those animated walk-throughs that so many apps have nowadays. I’d even go further, and say to have illustrated instructions whenever you can.

Think of those credit card input forms that look like actual credit cards. They’re a perfect example of setting clear expectations and easily-understood requirements. Users that have a clear idea of how to do what they want to do on your site will feel like they belong there.

5. Make a Human Point of Contact Available

“Nah, you’re doing fine.”

“Don’t worry about it. We have it covered on this end.”

“No, that was our mistake, really.”

“No worries, you got everything right.”

“Okay, that is a problem, but we can fix it.”

We all need reassurance from time to time. It helps to have another human being tell us that everything’s going to be okay in the end, and that we didn’t irrevocably ruin things for everyone. Want people to feel accepted? Give them a point of contact with people who can tell them those things.

Even if it’s just an email address, people need a way to talk to another person. Ultimately, a website can only do so much to make people feel accepted and respected. Sometimes, you just need the human touch to make a human connection.

Above all, delivering a feeling of acceptance and respect to people is about recognizing and even appreciating their humanity. We don’t always get enough of that in a digital world.

 

Featured image via Depositphotos.


15 Useful WordPress Functions All Developers Should Know

Original Source: https://www.hongkiat.com/blog/useful-wordpress-functions/

WordPress is full of great functions for us developers to use. We can pull post lists out of thin air, manipulate almost everything about them, grab any user we wish and display their social media…

Visit hongkiat.com for full content.

Animated Mesh Lines

Original Source: http://feedproxy.google.com/~r/tympanus/~3/Sd9FEXrcJ-s/


Two years ago, I started playing with lines in WebGL using THREE.MeshLine, a library made by Jaume Sanchez Elias for Three.js.

This library tackles the problem that you cannot control the width of your lines with the classic lines in Three.js. A MeshLine builds a strip of billboarded triangles to create a custom geometry, instead of using the native WebGL GL_LINE method, which does not support a width parameter.

These ribbon-shaped lines have a really interesting graphic style. They also have fewer vertices than the TubeGeometry usually used to create thick lines.

Animate a MeshLine

The only thing missing is the ability to animate lines without having to rebuild the geometry for each frame.
Based on what had already been started, and on how SVG line animation works, I added three new parameters to MeshLineMaterial to render an animated dashed line directly through the shader:

DashRatio: The ratio between what is visible or not (~0: more visible, ~1: less visible)
DashArray: The length of a dash and its space (0 == no dash)
DashOffset: The location where the first dash begins

Like with an SVG path, these parameters allow you to animate the entire traced line if they are correctly handled.

Here is a complete example of how to create and animate a MeshLine:

// Assuming Three.js and the THREE.MeshLine library are available, e.g.:
// import { Color, Mesh } from 'three';
// import { MeshLine, MeshLineMaterial } from 'three.meshline';

// Build an array of points
const segmentLength = 1;
const nbrOfPoints = 10;
const points = [];
for (let i = 0; i < nbrOfPoints; i++) {
  points.push(i * segmentLength, 0, 0);
}

// Build the geometry
const line = new MeshLine();
line.setGeometry(points);
const geometry = line.geometry;

// Build the material with good parameters to animate it.
const material = new MeshLineMaterial({
  lineWidth: 0.1,
  color: new Color('#ff0000'),
  dashArray: 2, // always has to be double the length of the line
  dashOffset: 0, // start the dash at zero
  dashRatio: 0.75, // visible length range min: 0.99, max: 0.5
});

// Build the Mesh
const lineMesh = new Mesh(geometry, material);
lineMesh.position.x = -4.5;

// ! Assuming you have your own WebGL engine to add meshes to the scene and update them.
webgl.add(lineMesh);

// ! Call each frame
function update() {
  // Check if the dash is fully out to stop animating it.
  if (lineMesh.material.uniforms.dashOffset.value < -2) return;

  // Decrement the dashOffset value to animate the path with the dash.
  lineMesh.material.uniforms.dashOffset.value -= 0.01;
}

First animated MeshLine

Create your own line style

Now that you know how to animate lines, I will show you some tips on how to customize the shape of your lines.

Use SplineCurve or CatmullRomCurve3

These classes smooth an array of roughly positioned points. They are perfect for building curved and fluid lines while keeping control of them (length, orientation, turbulence…).

For instance, let’s add some turbulence to our previous array of points:

// Assuming Vector3 is imported from 'three'.
const segmentLength = 1;
const nbrOfPoints = 10;
const points = [];
const turbulence = 0.5;
for (let i = 0; i < nbrOfPoints; i++) {
  // ! We have to wrap the points into a THREE.Vector3 this time
  points.push(new Vector3(
    i * segmentLength,
    (Math.random() * (turbulence * 2)) - turbulence,
    (Math.random() * (turbulence * 2)) - turbulence,
  ));
}

Then, use one of these classes to smooth your array of points before you create the geometry:

// 2D spline
// const linePoints = new Geometry().setFromPoints(new SplineCurve(points).getPoints(50));

// 3D spline
const linePoints = new Geometry().setFromPoints(new CatmullRomCurve3(points).getPoints(50));

const line = new MeshLine();
line.setGeometry(linePoints);
const geometry = line.geometry;

And just like that, you’ve created your smooth curved line!

Animated MeshLine Curved

Note that SplineCurve only smoothes in 2D (the x and y axes), whereas CatmullRomCurve3 takes all three axes into account.

I recommend using SplineCurve anyway. It is more performant for calculating lines and is often enough to create the desired curved effect.

For instance, my demos Confetti and Energy are only made with the SplineCurve method:

AnimatedMeshLine - Confetti demo

AnimatedMeshLine - Energy demo

Use Raycasting

Another technique, taken from a THREE.MeshLine example, is to use a Raycaster to scan a Mesh already present in the scene.

This way, you can create lines that follow the shape of an object:

const radius = 4;
const yMax = -4;
const points = [];
const origin = new Vector3();
const direction = new Vector3();
const raycaster = new Raycaster();

let y = 0;
let angle = 0;
// Start the scan, moving down until we reach yMax.
while (y > yMax) {
  // Update the orientation and the position of the raycaster
  y -= 0.1;
  angle += 0.2;
  origin.set(radius * Math.cos(angle), y, radius * Math.sin(angle));
  direction.set(-origin.x, 0, -origin.z);
  direction.normalize();
  raycaster.set(origin, direction);

  // Save the raycast coordinates.
  // ! Assuming the raycaster crosses the object in the scene each time
  const intersect = raycaster.intersectObject(objectToRaycast, true);
  if (intersect.length) {
    points.push(
      intersect[0].point.x,
      intersect[0].point.y,
      intersect[0].point.z,
    );
  }
}

This method is employed in the Boreal Sky demo. Here I used part of a sphere as the geometry to create the objectToRaycast mesh:

Boreal Sky - raycasting example

Now you have enough tools to play with and animate MeshLines. Many of these methods are inspired by the library’s examples. Feel free to explore them and share your own experiments and methods for creating your own lines!

References and Credits

Three.js
THREE.MeshLine
THREE.MeshLine – Shape example
Gsap

Animated Mesh Lines was written by Jérémie Boulay and published on Codrops.

3 Essential Design Trends, January 2019

Original Source: https://www.webdesignerdepot.com/2019/01/3-essential-design-trends-january-2019/

New year, new design trends!

While everyone is talking about big-picture trends such as designing for voice and virtual reality, there are more immediate design elements that you can see (and deploy) right now for a more on-trend website.

From websites without images above the scroll, to ecommerce that disguises itself as content, to bright blue everything, here’s a look at what’s trending this month.

1. No “Art” Above the Scroll

Have you noticed how many websites don’t have images or video above the scroll? This no “art” design style used to be reserved for coming-soon or under-construction pages that didn’t have images, but it’s trending even for website designs with plenty of other imagery.

If you have a message or statement that is the most important thing for users to know right away, this can be an effective design technique. It works because there’s nothing else to see. (Unless the user refuses to read the words and abandons the design, which can be a risk with this style.)

Make the most of a no art design with beautiful typography and strong color choices.

These design elements can serve as art on their own and help add visual interest to the words on the screen.

Each of the three examples below does this in a slightly different way.

We Are Crowd uses a strong serif-sans serif typography pair on a bright colored background. Users are enticed to delve into the design thanks to an animated scroller on the homepage. The no “art” design actually alternates between image and non-image panels, showing users there is something to look at.

Easys uses a simple serif in the center of the screen to draw the eye. Bright red text on a stark background draws you right to the lettering. There’s also a call to scroll, where the design fills with more color, images and interesting shapes.

David Pacheco’s portfolio takes a more minimal approach to the no art aesthetic. With simple, but oversized sans serif typography and a black background, there’s no way not to know what the website is about. The stark nature of the design makes the small, animated scroller at the bottom left more obvious, and it’s worth the effort with great images of his projects.

2. Bright Blue

It might not come as a surprise that a bright color is trending. But the hue might surprise you. Even though a bright coral was named color of the year by Pantone, bright blue is trending.

Bright blue backgrounds, text elements, overlays and more are everywhere. The color, which has roots in the Material Design palette, is rather cheerful and works with almost any content type. These are reasons why it might be so popular.

Blue is traditionally one of the most used colors in website design, mostly because of associations that are harmonious, pleasing, and trustworthy. This brighter blue adds to that with a somewhat lighter feel.

The best part of a website design trend that’s rooted in color is that it’s easy to use. Designers don’t have to overhaul a brand color palette to incorporate bright blue into the design. Sneak it in for accents or in images.

This is also a trend that you can add to a project quickly and without a lot of planning for a modern flair that can be deployed (and even removed) with minimal effort.

Like the blues you see in the examples below? Try these color mixes to replicate the hues:

Matt Downey uses a bright blue with an almost purple undertone; hex #263d83.

Output scrolls through plenty of bright blue options on the homepage, with more purple (hex #4d34d8) and brighter sky blue (hex #3a63d8) options.

Florent Biffi puts a new spin on navy with a brighter tone; hex #142877.

3. Ecommerce That Looks Like Content

Content is king…even when it comes to online sales.

Maybe as a design trend that’s a carryover from social media or maybe just to try something that’s a little more engaging, more ecommerce websites are deploying site architectures that look less like pages of items for sale and more like integrated content elements with a “buy it” button included.

This idea makes sales more about lifestyle and uses a product placement philosophy to sell items. This design concept can take a lot of time to plan out and design effectively, but it can be worthwhile if it resonates with a new customer or shopper base.

It likely takes a certain type of product as well. Ecommerce websites that use a lot of content, and that look more like content than product sales or information, tend to be lifestyle brands such as clothing or home textiles. This is because these items don’t need a lot of explanation (most shoppers know what size shirt they wear) and create a sense of urgency in the desire to buy. A customer sees the shirt and wants it because of the associated sense of style or connection to the brand.

It’s an interesting way to create a more authentic connection with users, a key marketing concept, particularly among millennial shoppers and audiences. It’s one of those design trends that has a lot of potential to grow, but only if designers and developers are willing to commit the time and resources to overhaul their ecommerce projects in this manner.

Conclusion

When it comes to using website design trends, the ones that come and go the quickest are the ones that are most deployable. Some of those, such as color, are on this list. Others, like ecommerce that looks like content, take a lot more strategy to do well. Which type of trend are you most likely to try?

What trends are you loving (or hating) right now? I’d love to see some of the websites that you are fascinated with. Drop me a link on Twitter; I’d love to hear from you.
