6 Resources For JavaScript Code Snippets

Original Source: http://feedproxy.google.com/~r/1stwebdesigner/~3/vFCNSyyCZSg/

When it comes to writing JavaScript (or any other code, for that matter) you can save a lot of time by not trying to reinvent the wheel – or coding something that is common enough that it has already been written countless times. In these instances it is helpful to have a list of collections of commonly (or sometimes not so commonly) used scripts or snippets you can refer to or search through to find what you need to either get your code started or solve your whole problem.

That is why we’ve put together this list of collections of JavaScript snippets so you can bookmark and refer back to them whenever you are in need. Here are six useful resources for JavaScript code snippets.


30 seconds of code

This JavaScript snippet collection contains a wide variety of ES6 helper functions. It includes helpers for dealing with primitives, arrays and objects, as well as algorithms, DOM manipulation functions and Node.js utilities.
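As a taste of what such a helper looks like, here is a sketch in the same spirit (this particular function is our own illustration, not copied from the collection):

```javascript
// Split an array into chunks of a given size, a typical small
// ES6 array helper of the kind these collections contain
const chunk = (arr, size) =>
  Array.from({ length: Math.ceil(arr.length / size) }, (_, i) =>
    arr.slice(i * size, i * size + size)
  )

console.log(chunk([1, 2, 3, 4, 5], 2)) // [[1, 2], [3, 4], [5]]
```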

30 seconds of code - JavaScript snippets

JavaScriptTutorial.net

This website has a section that provides you with handy functions for selecting, traversing, and manipulating DOM elements.

JavaScript Tutorial snippets

HTMLDOM.dev

This website focuses on scripts that manage HTML DOM with vanilla JavaScript, providing a nice collection of scripts that do just that.

HTMLDOM.dev

The Vanilla JavaScript Toolkit

Here’s a collection of native JavaScript methods, helper functions, libraries, boilerplates, and learning resources.

Vanilla JavaScript Toolkit

CSS Tricks

CSS Tricks has a nice collection of all different kinds of code snippets, including this great list of JS snippets.

CSS Tricks snippets

Top 100 JavaScript Snippets for Beginners

Focused on beginners, and a bit dated, this list is still a worthwhile resource to keep in your back pocket.

Top 100 Scripts for beginners

We hope you find this list helpful in your future projects. Be sure to check out all of the other JavaScript resources we have here at 1stWebDesigner while you’re at it!


How to create a realistic 3D sculpture

Original Source: http://feedproxy.google.com/~r/CreativeBloq/~3/s1TxoNFeJN4/3d-sculpture

Discover the creative process for a 3D sculpture with photorealistic detail.

Creating a Typography Motion Trail Effect with Three.js

Original Source: http://feedproxy.google.com/~r/tympanus/~3/LHLmzKgp7-0/

Framebuffers are a key feature in WebGL when it comes to creating advanced graphical effects such as depth-of-field, bloom, film grain or various types of anti-aliasing and have already been covered in-depth here on Codrops. They allow us to “post-process” our scenes, applying different effects on them once rendered. But how exactly do they work?

By default, WebGL (and also Three.js and all other libraries built on top of it) render to the default framebuffer, which is the device screen. If you have used Three.js or any other WebGL framework before, you know that you create your mesh with the correct geometry and material, render it, and voilà, it’s visible on your screen.

However, we as developers can create new framebuffers besides the default one and explicitly instruct WebGL to render to them. By doing so, we render our scenes to image buffers in the video card’s memory instead of the device screen. Afterwards, we can treat these image buffers like regular textures and apply filters and effects before eventually rendering them to the device screen.

Here is a video breaking down the post-processing and effects in Metal Gear Solid 5: Phantom Pain that really brings home the idea. Notice how it starts with footage from the actual game rendered to the default framebuffer (device screen) and then breaks down what each framebuffer looks like. All of these framebuffers are composited together on each frame and the result is the final picture you see when playing the game:

So with the theory out of the way, let’s create a cool typography motion trail effect by rendering to a framebuffer!

Our skeleton app

Let’s render some 2D text to the default framebuffer, i.e. device screen, using threejs. Here is our boilerplate:

const LABEL_TEXT = 'ABC'

const clock = new THREE.Clock()
const scene = new THREE.Scene()

// Create a threejs renderer:
// 1. Size it correctly
// 2. Set default background color
// 3. Append it to the page
const renderer = new THREE.WebGLRenderer()
renderer.setClearColor(0x222222)
renderer.setClearAlpha(0)
renderer.setSize(innerWidth, innerHeight)
renderer.setPixelRatio(devicePixelRatio || 1)
document.body.appendChild(renderer.domElement)

// Create an orthographic camera that covers the entire screen
// 1. Position it correctly in the positive Z dimension
// 2. Orient it towards the scene center
const orthoCamera = new THREE.OrthographicCamera(
-innerWidth / 2,
innerWidth / 2,
innerHeight / 2,
-innerHeight / 2,
0.1,
10,
)
orthoCamera.position.set(0, 0, 1)
orthoCamera.lookAt(new THREE.Vector3(0, 0, 0))

// Create a plane geometry that spans either the entire
// viewport height or width, depending on which one is bigger
const labelMeshSize = innerWidth > innerHeight ? innerHeight : innerWidth
const labelGeometry = new THREE.PlaneBufferGeometry(
labelMeshSize,
labelMeshSize
)

// Programmatically create a texture that will hold the text
let labelTextureCanvas
{
// Canvas and corresponding context2d to be used for
// drawing the text
labelTextureCanvas = document.createElement('canvas')
const labelTextureCtx = labelTextureCanvas.getContext('2d')

// Dynamic texture size based on the device capabilities
const textureSize = Math.min(renderer.capabilities.maxTextureSize, 2048)
const relativeFontSize = 20
// Size our text canvas
labelTextureCanvas.width = textureSize
labelTextureCanvas.height = textureSize
labelTextureCtx.textAlign = 'center'
labelTextureCtx.textBaseline = 'middle'

// Dynamic font size based on the texture size
// (based on the device capabilities)
labelTextureCtx.font = `${relativeFontSize}px Helvetica`
const textWidth = labelTextureCtx.measureText(LABEL_TEXT).width
const widthDelta = labelTextureCanvas.width / textWidth
const fontSize = relativeFontSize * widthDelta
labelTextureCtx.font = `${fontSize}px Helvetica`
labelTextureCtx.fillStyle = 'white'
labelTextureCtx.fillText(LABEL_TEXT, labelTextureCanvas.width / 2, labelTextureCanvas.height / 2)
}
// Create a material with our programmatically created text
// texture as input
const labelMaterial = new THREE.MeshBasicMaterial({
map: new THREE.CanvasTexture(labelTextureCanvas),
transparent: true,
})

// Create a plane mesh, add it to the scene
const labelMesh = new THREE.Mesh(labelGeometry, labelMaterial)
scene.add(labelMesh)

// Start our animation render loop
renderer.setAnimationLoop(onAnimLoop)

function onAnimLoop() {
// On each new frame, render the scene to the default framebuffer
// (device screen)
renderer.render(scene, orthoCamera)
}

This code simply initialises a threejs scene, adds a 2D plane with a text texture to it and renders it to the default framebuffer (device screen). If we execute it with threejs included in our project, we will get this:

See the Pen
Step 1: Render to default framebuffer by Georgi Nikoloff (@gbnikolov)
on CodePen.

Again, we don’t explicitly specify otherwise, so we are rendering to the default framebuffer (device screen).

Now that we managed to render our scene to the device screen, let’s add a framebuffer (THREE.WebGLRenderTarget) and render it to a texture in the video card memory.

Rendering to a framebuffer

Let’s start by creating a new framebuffer when we initialise our app:

const clock = new THREE.Clock()
const scene = new THREE.Scene()

// Create a new framebuffer we will use to render to
// the video card memory
const renderBufferA = new THREE.WebGLRenderTarget(
innerWidth * devicePixelRatio,
innerHeight * devicePixelRatio
)

// … rest of application

Now that we have created it, we must explicitly instruct threejs to render to it instead of the default framebuffer, i.e. device screen. We will do this in our program animation loop:

function onAnimLoop() {
// Explicitly set renderBufferA as the framebuffer to render to
renderer.setRenderTarget(renderBufferA)
// On each new frame, render the scene to renderBufferA
renderer.render(scene, orthoCamera)
}

And here is our result:

See the Pen
Step 2: Render to a framebuffer by Georgi Nikoloff (@gbnikolov)
on CodePen.

As you can see, we are getting an empty screen, yet our program contains no errors – so what happened? Well, we are no longer rendering to the device screen, but another framebuffer! Our scene is being rendered to a texture in the video card memory, so that’s why we see the empty screen.

In order to display this generated texture containing our scene back to the default framebuffer (device screen), we need to create another 2D plane that will cover the entire screen of our app and pass the texture as material input to it.

First we will create a fullscreen 2D plane that will span the entire device screen:

// … rest of initialisation step

// Create a second scene that will hold our fullscreen plane
const postFXScene = new THREE.Scene()

// Create a plane geometry that covers the entire screen
const postFXGeometry = new THREE.PlaneBufferGeometry(innerWidth, innerHeight)

// Create a plane material that expects a sampler texture input
// We will pass our generated framebuffer texture to it
const postFXMaterial = new THREE.ShaderMaterial({
uniforms: {
sampler: { value: null },
},
// vertex shader will be in charge of positioning our plane correctly
vertexShader: `
varying vec2 v_uv;

void main () {
// Set the correct position of each plane vertex
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);

// Pass in the correct UVs to the fragment shader
v_uv = uv;
}
`,
fragmentShader: `
// Declare our texture input as a "sampler" variable
uniform sampler2D sampler;

// Consume the correct UVs from the vertex shader to use
// when displaying the generated texture
varying vec2 v_uv;

void main () {
// Sample the correct color from the generated texture
vec4 inputColor = texture2D(sampler, v_uv);
// Set the correct color of each pixel that makes up the plane
gl_FragColor = inputColor;
}
`
})
const postFXMesh = new THREE.Mesh(postFXGeometry, postFXMaterial)
postFXScene.add(postFXMesh)

// … animation loop code here, same as before

As you can see, we are creating a new scene that will hold our fullscreen plane. After creating it, we need to augment our animation loop to render the generated texture from the previous step to the fullscreen plane on our screen:

function onAnimLoop() {
// Explicitly set renderBufferA as the framebuffer to render to
renderer.setRenderTarget(renderBufferA)

// On each new frame, render the scene to renderBufferA
renderer.render(scene, orthoCamera)

// 👇
// Set the device screen as the framebuffer to render to
// In WebGL, framebuffer "null" corresponds to the default
// framebuffer!
renderer.setRenderTarget(null)

// 👇
// Assign the generated texture to the sampler variable used
// in the postFXMesh that covers the device screen
postFXMesh.material.uniforms.sampler.value = renderBufferA.texture

// 👇
// Render the postFX mesh to the default framebuffer
renderer.render(postFXScene, orthoCamera)
}

After including these snippets, we can see our scene once again rendered on the screen:

See the Pen
Step 3: Display the generated framebuffer on the device screen by Georgi Nikoloff (@gbnikolov)
on CodePen.

Let’s recap the necessary steps needed to produce this image on our screen on each render loop:

1. Create a renderTargetA framebuffer that will allow us to render to a separate texture in the user’s device video memory
2. Create our “ABC” plane mesh
3. Render the “ABC” plane mesh to renderTargetA instead of the device screen
4. Create a separate fullscreen plane mesh that expects a texture as an input to its material
5. Render the fullscreen plane mesh back to the default framebuffer (device screen) using the generated texture created by rendering the “ABC” mesh to renderTargetA

Achieving the persistence effect by using two framebuffers

We don’t have much use for framebuffers if we are simply displaying them as they are on the device screen, as we do right now. Now that we have our setup ready, let’s actually do some cool post-processing.

First, we actually want to create yet another framebuffer – renderTargetB, and make sure it and renderTargetA are let variables, rather than consts. That’s because we will actually swap them at the end of each render so we can achieve framebuffer ping-ponging.

“Ping-ponging” in WebGL is a technique that alternates the use of a framebuffer as either input or output. It is a neat trick that allows for general-purpose GPU computations and is used in effects such as Gaussian blur, where in order to blur our scene we need to:

1. Render it to framebuffer A using a 2D plane and apply horizontal blur via the fragment shader
2. Render the resulting horizontally blurred image from step 1 to framebuffer B and apply vertical blur via the fragment shader
3. Swap framebuffer A and framebuffer B
4. Keep repeating steps 1 to 3, incrementally applying blur, until the desired Gaussian blur radius is achieved
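The ping-pong pattern itself can be sketched in plain JavaScript, using arrays in place of GPU framebuffers (the names and the toy “blur” below are illustrative, not Three.js API). Each pass reads from one buffer and writes to the other, then the two roles are swapped:

```javascript
// Stand-in for a real blur shader: average each value with its
// neighbours (clamped at the edges)
function blurPass(src) {
  return src.map((v, i) => {
    const left = src[Math.max(i - 1, 0)]
    const right = src[Math.min(i + 1, src.length - 1)]
    return (left + v + right) / 3
  })
}

let read = [0, 0, 9, 0, 0] // a single bright "pixel" in the middle
let write = []

for (let pass = 0; pass < 3; pass++) {
  write = blurPass(read)

  // Ping-pong: this pass's output becomes the next pass's input
  const temp = read
  read = write
  write = temp
}

// After three passes the central spike has spread out over the
// neighbouring cells
console.log(read)
```

On the GPU the idea is identical, except the buffers are textures and the “pass” is a fragment shader run over a fullscreen plane.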

Here is a small chart illustrating the steps needed to achieve ping-pong:

So with that in mind, we will render the contents of renderTargetA into renderTargetB using the postFXMesh we created and apply some special effect via the fragment shader.

Let’s kick things off by creating our renderTargetB:

let renderBufferA = new THREE.WebGLRenderTarget(
// …
)
// Create a second framebuffer
let renderBufferB = new THREE.WebGLRenderTarget(
innerWidth * devicePixelRatio,
innerHeight * devicePixelRatio
)

Next up, let’s augment our animation loop to actually do the ping-pong technique:

function onAnimLoop() {
// 👇
// Do not clear the contents of the canvas on each render
// In order to achieve our ping-pong effect, we must draw
// the new frame on top of the previous one!
renderer.autoClearColor = false

// 👇
// Explicitly set renderBufferA as the framebuffer to render to
renderer.setRenderTarget(renderBufferA)

// 👇
// Render the postFXScene to renderBufferA.
// This will contain our ping-pong accumulated texture
renderer.render(postFXScene, orthoCamera)

// 👇
// Render the original scene containing ABC again on top
renderer.render(scene, orthoCamera)

// Same as before
// …
// …

// 👇
// Ping-pong our framebuffers by swapping them
// at the end of each frame render
const temp = renderBufferA
renderBufferA = renderBufferB
renderBufferB = temp
}

If we are to render our scene again with these updated snippets, we will see no visual difference, even though we do in fact alternate between the two framebuffers to render it. That’s because, as it is right now, we do not apply any special effects in the fragment shader of our postFXMesh.

Let’s change our fragment shader like so:

// Sample the correct color from the generated texture
// 👇
// Notice how we now apply a slight 0.005 offset to our UVs when
// looking up the correct texture color

vec4 inputColor = texture2D(sampler, v_uv + vec2(0.005));
// Set the correct color of each pixel that makes up the plane
// 👇
// We fade out the color from the previous step to 97.5% of
// whatever it was before
gl_FragColor = vec4(inputColor * 0.975);

With these changes in place, here is our updated program:

See the Pen
Step 4: Create a second framebuffer and ping-pong between them by Georgi Nikoloff (@gbnikolov)
on CodePen.

Let’s break down one frame render of our updated example:

1. We render the renderTargetB result to renderTargetA
2. We render our “ABC” text to renderTargetA, compositing it on top of the renderTargetB result from step 1 (we do not clear the contents of the canvas on new renders, because we set renderer.autoClearColor = false)
3. We pass the generated renderTargetA texture to postFXMesh, apply a small offset vec2(0.005) to its UVs when looking up the texture color and fade it out a bit by multiplying the result by 0.975
4. We render postFXMesh to the device screen
5. We swap renderTargetA with renderTargetB (ping-ponging)

For each new frame render, we will repeat steps 1 to 5. This way, the previous target framebuffer we rendered to will be used as an input to the current render and so on. You can clearly see this effect visually in the last demo – notice how as the ping-ponging progresses, more and more offset is being applied to the UVs and more and more the opacity fades out.
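To get a feel for how fast the trail fades, note that after n frames a sample retains 0.975^n of its original brightness. A quick back-of-the-envelope sketch (this helper is our own illustration, not part of the demo code):

```javascript
// After n frames, a trail sample retains fadeFactor^n of its
// brightness. Compute the number of frames until it drops below
// a given threshold.
function framesUntilFaded(fadeFactor, threshold) {
  return Math.ceil(Math.log(threshold) / Math.log(fadeFactor))
}

// With the 0.975 fade factor, a sample dips below 1% brightness
// after about 182 frames, i.e. roughly three seconds at 60fps
console.log(framesUntilFaded(0.975, 0.01)) // 182
```

Tweaking the 0.975 multiplier in the fragment shader therefore directly controls how long the trail lingers.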

Applying simplex noise and mouse interaction

Now that we have implemented and can see the ping-pong technique working correctly, we can get creative and expand on it.

Instead of simply adding an offset in our fragment shader as before:

vec4 inputColor = texture2D(sampler, v_uv + vec2(0.005));

Let’s actually use simplex noise for more interesting visual result. We will also control the direction using our mouse position.

Here is our updated fragment shader:

// Pass in elapsed time since start of our program
uniform float time;

// Pass in normalised mouse position
// (-1 to 1 horizontally and vertically)
uniform vec2 mousePos;

// <Insert snoise function definition from the link above here>

// Calculate different offsets for x and y by using the UVs
// and different time offsets to the snoise method
float a = snoise(vec3(v_uv * 1.0, time * 0.1)) * 0.0032;
float b = snoise(vec3(v_uv * 1.0, time * 0.1 + 100.0)) * 0.0032;

// Add the snoise offset multiplied by the normalised mouse position
// to the UVs
vec4 inputColor = texture2D(sampler, v_uv + vec2(a, b) + mousePos * 0.005);

We also need to specify mousePos and time as inputs to our postFXMesh material shader:

const postFXMaterial = new THREE.ShaderMaterial({
uniforms: {
sampler: { value: null },
time: { value: 0 },
mousePos: { value: new THREE.Vector2(0, 0) }
},
// …
})

Finally, let’s make sure we attach a mousemove event listener to our page and pass the updated normalised mouse coordinates from JavaScript to our GLSL fragment shader:

// … initialisation step

// Attach mousemove event listener
document.addEventListener('mousemove', onMouseMove)

function onMouseMove (e) {
// Normalise horizontal mouse pos from -1 to 1
const x = (e.pageX / innerWidth) * 2 - 1

// Normalise vertical mouse pos from -1 to 1
const y = (1 - e.pageY / innerHeight) * 2 - 1

// Pass normalised mouse coordinates to fragment shader
postFXMesh.material.uniforms.mousePos.value.set(x, y)
}

// … animation loop

With these changes in place, here is our final result. Make sure to hover around it (you might have to wait a moment for everything to load):

See the Pen
Step 5: Perlin Noise and mouse interaction by Georgi Nikoloff (@gbnikolov)
on CodePen.

Conclusion

Framebuffers are a powerful tool in WebGL that allows us to greatly enhance our scenes via post-processing and achieve all kinds of cool effects. Some techniques require more than one framebuffer, as we saw, and it is up to us as developers to mix and match them however we need to achieve our desired visuals.

I encourage you to experiment with the provided examples, try to render more elements, alternate the “ABC” text color between each renderTargetA and renderTargetB swap to achieve different color mixing, etc.

In the first demo, you can see a specific example of how this typography effect could be used and the second demo is a playground for you to try some different settings (just open the controls in the top right corner).

Further readings:

How to use post-processing in threejs
Filmic effects in WebGL
Threejs GPGPU flock simulation

The post Creating a Typography Motion Trail Effect with Three.js appeared first on Codrops.

10 Color Palette Generators & Tools For Your Web Design Projects

Original Source: http://feedproxy.google.com/~r/1stwebdesigner/~3/_bE29DwNgtI/

In today’s article we’ve rounded up ten of the best tools and websites for generating a color palette for your web design project. With some of these tools you can use a photo or image to generate your palette. Some use artificial intelligence. And some are completely unique, with communities where you can check out other designers’ submitted palettes as well as share your own. Let’s check them out!


Coolors

Create the perfect palette or get inspired by thousands of beautiful color schemes, available in the browser, iOS app, Adobe extension, and Chrome extension.

Coolors - color palette generator

Paletton

Whether you’re a professional designer, a starting artist or just a curious beginner in the world of art and design, Paletton is here to help you with all your color palette needs. You don’t need to know the ins and outs of color theory in order to use Paletton’s unique and easy color wheel. All you need to do is choose the basic color you are interested in exploring, and get inspired.

Paletton - color palette generator

Colormind

Colormind is a color scheme generator that uses deep learning AI. It can learn color styles from photographs, movies, and popular art. Different datasets are loaded each day.

Colormind - color palette generator

Adobe Color

With Adobe Color, you have access to the powerful harmonization engines for creating beautiful color themes to use in Adobe products. Start your color journey by exploring themes from the Color community. Be inspired by other creatives in curated Trend Galleries from Behance and Adobe Stock. Import photos and images to generate cohesive color palettes from your artwork.

Adobe Color - color palette generator

Canva Color Palette Generator

Want a color scheme that perfectly matches your favorite images? With Canva’s color palette generator, you can create color combinations in seconds. Simply upload a photo, and they’ll use the hues in the photo to create your palette.

Canva - color palette generator

COLOURlovers

COLOURlovers is a creative community where people from around the world create and share colors, palettes and patterns, discuss the latest trends and explore colorful articles… All in the spirit of love.

COLOURlovers - color palette generator

Color Hunt

Color Palettes for Designers and Artists. Discover the newest hand-picked palettes of Color Hunt, similar to Product Hunt, but for colors.

Color Hunt - color palette tool

Colordot

Colordot is “a color picker for humans”, using a unique interface. It also has an iOS app.

Colordot

Palettr

Generate fresh, new color palettes inspired by a theme or a place.

Palettr

Mudcube

Last in our collection is Mudcube, a color wheel to generate unique color palettes that are downloadable in .AI and .ACO file formats.

Mudcube


Making A Strong Case For Accessibility

Original Source: https://smashingmagazine.com/2021/07/strong-case-for-accessibility/

Imagine yourself as someone with a visual disability. Cataracts, or even total blindness. A site can be inaccessible for many reasons, some intentional and some not: accessibility may have been left to the end of the project, cut from the budget, or simply never practiced. Either way, you can’t access the vital information on the site because it isn’t accessible to people with visual disabilities.

Or maybe you have a motor skill disability, and there is a part of a site you want to buy something on that you cannot reach, preventing you from obtaining the information you need.

These are just a couple of examples of what disabled users face daily when they try to access an inaccessible website. The case for accessibility is this: we as stakeholders, managers, teams, designers and developers need to do better in not only practicing accessibility but advocating for it as well.

If you have ever read the WebAIM Million report, you can see where the breakdown is, and until 2021 there wasn’t much in the way of improvement. Across the one million home pages that were tested, an average of 51.4 errors per page was detected. While the number of errors decreased, home page complexity increased in terms of the number of page elements with detectable accessibility errors.

Inclusive design is a way of creating digital products that are accessible to a wide range of people regardless of who they are, disabled or not, where they are, and encapsulates a diverse spectrum of people. The British Standards Institution (2005) defines inclusive design as:

“The design of mainstream products and/or services that are accessible to, and usable by, as many people as reasonably possible … without the need for special adaptation or specialized design.”

Practicing accessibility in your workflows and methodologies ensures that people, disabled or not, can access your product, your website, and your brand. Inclusive design and accessibility go hand-in-hand. Let’s look at some examples of accessibility.

Examples Of Practical Digital Accessibility

Your site has a color scheme that looks great after the designers are done with the mock-ups, or the color scheme your brand uses is “perfect” in the eyes of people that do not have a visual disability. People that have a visual disability like glaucoma, cataracts, or tritanopia (a deficiency of blue in one’s vision) cannot make out a particular color, and the combination does not meet WCAG standards.
If your light blue font color on a darker blue background did not meet those guidelines set in WCAG, then it would be inaccessible. If you switched to a lighter blue or white font within the 4.5 to 1 ratio guideline, which you can read about in Success Criterion 1.4.3: Contrast (Minimum), then it would be accessible.
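That 4.5 to 1 ratio comes from a concrete formula. Here is a rough sketch (the function names are our own, but the math follows the WCAG 2.x definitions of relative luminance and contrast ratio):

```javascript
// Convert an sRGB channel (0–255) to its linearized value
function channelToLinear(c) {
  const s = c / 255
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4)
}

// Relative luminance per WCAG: weighted sum of linearized channels
function relativeLuminance([r, g, b]) {
  return (
    0.2126 * channelToLinear(r) +
    0.7152 * channelToLinear(g) +
    0.0722 * channelToLinear(b)
  )
}

// Contrast ratio = (L1 + 0.05) / (L2 + 0.05), lighter over darker
function contrastRatio(rgb1, rgb2) {
  const l1 = relativeLuminance(rgb1)
  const l2 = relativeLuminance(rgb2)
  const lighter = Math.max(l1, l2)
  const darker = Math.min(l1, l2)
  return (lighter + 0.05) / (darker + 0.05)
}

// White on black gives the maximum possible ratio of 21:1
console.log(contrastRatio([255, 255, 255], [0, 0, 0]))
// Normal-size text passes WCAG AA when the ratio is at least 4.5:1
console.log(contrastRatio([255, 255, 255], [0, 0, 255]) >= 4.5) // true
```

Tools like the ones linked throughout this article do this computation for you, but it is useful to know what the numbers mean.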

You have a site that does not traverse well (or at all) when disabled users use assistive technologies such as a screen reader, keyboard, or voice recognition to navigate around the site because they lack the motor skills or dexterity to do so. An accessible site makes sure there is a way for those users to navigate through the site with no issues. Examples of keyboard navigation can be seen in this video and in this short video about voice recognition.
An accordion that uses items that when opened gives you some scrollable content, yet the content is not scrollable via keyboard. It is inaccessible to people that use the keyboard for navigation. An accessible site makes sure that the content is accessible via keyboard in that instance.
Images that do not have alternative text, describing to screen reader users what the image is or what message it is trying to convey, are also inaccessible. Make sure meaningful images carry descriptive alternative text, e.g. <img src="" alt="This is an image of my favorite kind of animal.">

Users of newer technologies, people that live in metropolitan areas, or people that can afford to buy the latest tech, for example, have fewer barriers to break through than people that cannot afford a new phone or tablet, or are in very remote or rural areas.
The glare of the sun on your mobile device as you’re on the beach on a sunny day. You’re shielding the screen or squinting to read what restaurant to go to after your day at the beach is done. That is a situational impairment.
A fussy child on your lap moving around while you are trying to read an email or an injured arm as you try to answer an email, hindering your ability to type as you normally would is a situational impairment.

There are a lot more examples of accessibility I could mention but that would take a very long time to write, so these are just a few examples.

Actionable Steps To Get Buy-In

Two of the most widely used reasons I hear as to why people or companies aren’t practicing accessibility are:

“The client has no budget for it.”
“My manager said we’ll get to it after launch.”

Buy-In And Support From Executives/Stakeholders

From the outset, advocating for accessibility starts at the top. Once you have stakeholder support, you may see support trickle down to managers, then teams, and then individuals. It all starts with you first. With buy-in and support from executives, accessibility efforts are far more likely to succeed across the organization.

When the ethical approach doesn’t work, I take the financial approach:

“You’ll be saving the company a lot of money when you do this from the start. When maintenance is needed, it won’t take the team as long to maintain the code because of accessibility and clean code.”

When, or if, that fails, I’ll go for the legal ramifications and cite instances from the lawsuits that have been won against Target, Bank of America, Domino’s Pizza, and others. It’s amazing how quickly executives and stakeholders act when they don’t want to get sued.

Keeping executives engaged and meeting with them regularly will not only help your accessibility initiatives succeed, but will also give you their support when new initiatives need to be implemented or when teams disagree on implementation or prioritization.

Being accessible is a good way for a company to differentiate itself from other companies; when you make a quality product, company buy-in becomes greater in some instances. Teams that push for accessibility usually lead the way to getting buy-in from other departments and executives. If the product is high quality and makes the company money, that’s when the company is swayed to adopt the practice.

Demonstrations of live testing with disabled users are another way to get buy-in across the board: they humanize the decision-making process and get executives and colleagues on board by showing them the struggles disabled users face daily with inaccessible products. Or, an approach I have used in the past: don’t ask, just do it.

Most of the time, however, it is how the practice of accessibility can make the company money, or the legal consequences a company can face, that sways an executive to adopt the practice. It is often only then that an executive or stakeholder starts learning about accessibility, if they are willing to invest the time.

Coordinating efforts across departments may be difficult and time-consuming at first so that support from the top will help alleviate the pressures and burnout that can happen when taking on the task of creating and implementing an accessibility strategy.

Have A Team Or Individual Who Is Your Accessibility Advocate(s)

Once you have buy-in from executives or stakeholders, have a person in each department, or a dedicated team, act as your accessibility advocate and liaison.

Have someone who can answer questions, work with others to follow the guidelines and build accessible products, help set up documentation and tooling, and serve as an intermediary between departments.

Assess the Product and Proficiency Within the Company

Gauging how inclusive and accessible the product(s) currently are is a key priority. This gives the team or individual a baseline to measure their efforts against as they make the product better and more accessible. What is the current state of the product? What is the current state of the website or mobile application?

Getting a general idea of the level of knowledge that teams and people in the company currently have is important going forward. How versed are they in accessibility guidelines and practices? Do they know anything about the Web Content Accessibility Guidelines (WCAG)? How much training have they had, and how much will they need?

Maintaining a written record of all accessibility training done to meet the requirements that apply to your organization is a great way to keep data on all the training within the organization: record each session, who completed it, and when. If there is no in-house accessibility training available, your organization can look into external training like the kind WebAIM, the ADA, or Knowbility has to offer.

Establish Guidelines For The Company

Consistent implementation of the product greatly benefits the organization. It reduces the amount of work, which in turn can reduce the stress teams are under. Design systems should be used not only to ensure branding and consistency, but also accessibility, inclusivity, and a better understanding of the code.

Accessible components help for obvious reasons: they reduce implementation time compared with starting from nothing and reinventing something that has already been done. Testing procedures should be put in place so that departments, especially QA and development, can do their jobs well and efficiently.

Documenting guidelines for your organization is as simple as creating a set of accessibility guidelines. You could document them internally in collaborative software such as Notion or Dynalist, or in an online document like Google Docs or Dropbox Paper: somewhere collaborative, where people can add to the documentation the organization has.

Getting Fellow Colleagues To Buy-In

In this landscape of frameworks and libraries, “going fast and breaking things”, and overlooking and undervaluing accessibility, people need to be educated and that also needs to happen at the team level. The people that do not have voices, the “people on the other side of the glass” need YOU to be their voice.

As a freelancer, setting up meetings or training sessions to onboard the members of an organization you may be working with can be beneficial to all. Holding a workshop or webinar is another way to train colleagues and win their buy-in.

Training helps get a team on board: it brings people together, makes the importance of accessibility clear, and builds the desire to produce a quality product that people can use regardless of disability.

In my experience, pitching accessibility to those who do not yet realize it means less rework, less stress, and fewer headaches can sway a developer very fast.

Sharing The Importance Of The Rules

Whether you live in the EU, the UK, Canada, or the United States, most countries have rules regarding standards for accessibility. Familiarity with those rules and guidelines helps ensure compliance on another level.

Whether it is the ADA (Americans with Disabilities Act) or Section 508 (government compliance) in the U.S., the ACA (Accessible Canada Act) in Canada, or EN 301 549 in the EU, sharing the importance of the guidelines can be crucial to getting departments, executives, and the organization as a whole on board.

Pick Examples From The Outside World As Use Cases

Test and record cases where a disabled user is trying to use the product, website, or mobile app. Showing colleagues and executives these tests and use cases will bolster the argument you have for implementing accessibility at your organization.

From there, you could get a source outside the organization that specializes in accessibility testing with disabled users, such as Applause, for instance. The organization and people within may turn around and embrace accessibility at the company and in the workflow.

Hire Disabled People

Whether hired internally or on a contract basis through an outside firm like Applause, people with lived experience of disability bring real value to you, your team, and the organization.

Get executives and hiring managers on board to bring on people with disabilities to not only help with accessibility, but also teach and advocate for accessibility and inclusivity within the organization.

Best Practices For Maintaining Accessibility

Accessibility does not end at handoff or when the project is “finished.” As with the web itself, accessibility is ever-evolving and needs periodic checking whenever new features are implemented or changes are made, to ensure it is still being practiced and adhered to.

Vigilance over the accessibility of the product(s) ensures a consistent standard. Automate testing wherever possible, in a way that fits each department's strategy, when it comes time to release new features or changes.
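As a minimal sketch of what an automated check in a release pipeline might look like, the function below scans simplified element records for a few common failures. The records and rule names here are hypothetical; in practice you would point a dedicated tool such as axe-core or pa11y at your rendered pages instead.

```javascript
// Hypothetical minimal accessibility audit over simplified element records.
// Each record is a plain object like { tag, id, alt, label, text }.
function auditElements(elements) {
  const issues = [];
  for (const el of elements) {
    // Images need alternative text for screen reader users.
    if (el.tag === "img" && !el.alt) {
      issues.push({ id: el.id, rule: "img-missing-alt" });
    }
    // Form inputs need an associated label.
    if (el.tag === "input" && !el.label) {
      issues.push({ id: el.id, rule: "input-missing-label" });
    }
    // Links need discernible text.
    if (el.tag === "a" && !el.text) {
      issues.push({ id: el.id, rule: "link-missing-text" });
    }
  }
  return issues;
}

// Example run: two of these three elements fail a rule.
const report = auditElements([
  { tag: "img", id: "hero", alt: "" },
  { tag: "input", id: "email", label: "Email address" },
  { tag: "a", id: "cta", text: "" },
]);
console.log(report.map((i) => i.rule)); // → [ 'img-missing-alt', 'link-missing-text' ]
```

A check like this can run on every build and fail the pipeline when the issue list is non-empty, which keeps regressions from reaching a release.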

Barriers will arise; when they do, address them in a manner that expedites the fixes, takes the barriers down, and keeps the product accessible for those who need it.

Perform screen reader analysis before every release to ensure that users of screen readers and other assistive technologies can use the website or mobile app.

Annual audits and user testing are always important, whether conducted internally by a team or by a third party that specializes in accessibility auditing, especially user testing with disabled users. What does such an audit entail?

An executive summary for stakeholders that details what the product needs in order to become compliant, as well as its current state.
A developer report that details every possible path through the website, mobile app, or product, addressing the concerns and needs that will be encountered along the way.

Summary

Accessibility matters. It matters to those who are getting shut out on a daily basis from some form of digital creation they encounter. Whether it is by design or not, the fact that accessibility is an afterthought in most cases is a critical oversight we all have to correct.

Accessibility and accessible sites and apps make the web better; they make everyone feel included, no matter their situation or disability. Inclusion and accessibility remove barriers for disabled people, and accessibility and performance also make the web usable for those who aren't equipped with the latest and greatest phones or devices.

Getting on board with accessibility is something we should all do. Let’s remember the people on the other side of the glass. Accessibility is a right, not a privilege. Let’s make accessibility a priority.

10 Apps to Record Your Baby’s Milestones & Achievements

Original Source: https://www.hongkiat.com/blog/record-baby-daily-life/

Having a child is one of life’s great joys, or so I’m told. If you’re a new parent, it’s only natural that you’ll want to document and record your baby’s life….

Visit hongkiat.com for full content.

These might be the weirdest wireless headphones we've ever seen

Original Source: http://feedproxy.google.com/~r/CreativeBloq/~3/JqVzzeMQvnc/sony-neckband-headphones

Sony’s neckband is… interesting.

3 Essential Design Trends, July 2021

Original Source: https://www.webdesignerdepot.com/2021/07/3-essential-design-trends-july-2021/

This month, you will either love or hate the featured design trends.

The common theme among them is a strong design element that can create distinct emotional connections. They range from interesting monotone color choices to brutalist examples to AI-inspired faces and design elements.

Here’s what’s trending in design this month.

1. Interesting Monotone Color Palettes

Monotone color palettes aren’t something that we usually call a trending design theme because mono patterns are almost always in style. What makes these monotone website designs interesting is color choice.

The trend is to use a pretty unconventional color choice for monotone color palettes. For example, would you start the design process thinking of an all-mauve, canary yellow, or purple aesthetic?

For most designers, those probably aren't the first choices. And yet the outcome of those decisions is rather stunning in each of the examples below, whether you love the color choices or not.

What works (and what might fall short) with each of these trending examples:

Wookmama: This mauve color scheme might be the first of its kind you've encountered. It uses varying hues that are pretty in-your-face. It works because the concept behind the website is to create custom color schemes. The challenge lies in contrast: there's not a lot of distinction between hues in the mono scheme.

BBC Storyworks: The deep purple color palette with pinkish highlights is bright and readable, despite the dark background. White text and elements with smooth animation bring out the regality of the color choice. The challenge with this color is that purple often has strong emotional associations for individuals (good and bad), and you don’t know what “baggage” users might bring to the design.

Yellow Pony: This design is incredibly bright and has some brutalist undertones. What makes this color choice work is that it stops you in your tracks. You can’t help but look at the bright yellow and oddly-colored pony. The challenge, like with Wookmama, is contrast. There’s also a lot going on here with the bright color.
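The contrast challenge called out in these examples can actually be measured. The sketch below implements the WCAG 2.x relative luminance and contrast ratio formulas, which is one way to sanity-check a monotone palette; the example colors are made-up mauve hues, not taken from any of the sites above.

```javascript
// WCAG 2.x relative luminance of an sRGB color given as [r, g, b] in 0–255.
function relativeLuminance([r, g, b]) {
  const channel = (v) => {
    const c = v / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio (lighter + 0.05) / (darker + 0.05), ranging from 1 to 21.
function contrastRatio(colorA, colorB) {
  const la = relativeLuminance(colorA);
  const lb = relativeLuminance(colorB);
  const [lighter, darker] = la >= lb ? [la, lb] : [lb, la];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white hits the maximum 21:1; WCAG AA asks for at least
// 4.5:1 for normal-size body text.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // → 21.0
// Two nearby mauve hues (hypothetical values) can easily fall short of 4.5:1:
console.log(contrastRatio([224, 176, 255], [190, 140, 220]).toFixed(2));
```

Running a check like this across a palette's text/background pairs quickly shows which combinations in a monotone scheme are readable and which only look distinct to fully sighted users.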


2. Fairly Brutal Black and White

Brutalism and brutalist design themes seem to keep ebbing and flowing. Understandably, it seems like, as a whole, designers can’t quite decide how they feel about this overall visual theme.

This trio of fairly brutal designs shares more than starkness in technique. They also feature distinct black and white color schemes and animation.

Put it all together, and the overall theme is maybe more “fairly brutal” than straight brutalist, re-emphasizing the hesitancy with the trend.

What’s nice about each of these designs is that they feel special and content-focused. This is a little in contrast with some other brutalist designs that are so stark and harsh that it can be hard to figure out what you are supposed to do with the website or what information is most important.

The other interesting thing here is that while all three websites have a similar design theme, they are nothing alike. (Personally, I find this type of brutalism and the included animation a lot easier to understand and digest. It uses the harsh feeling that you want to associate with the style but adds an element of comprehension that’s incredibly valuable.)

Callshop Radio uses an almost magazine-style block design with big buttons, a simple animation, and a flash of color.

BCKDRP features a subtler, richer, almost-black background with blocky type and accent color, without the harshness often associated with brutalist styles.

Vision Get Wild may be the closest to true brutalism, but the animated element in the center of the screen has a simple softness that lightens the entire feel.


3. Futuristic Faces

The final trending design element this month is a fun take on faces. There’s a movement happening with a futuristic or artificial intelligence/cyborg-inspired look to the people featured in the designs.

It’s hard to say where this design inspiration is coming from, but it is fun to look at with so many ways to play the style. The other commonality seems to be the dominant use of female faces.

These computer-generated images start with photos that are brightened and smoothed so that all imperfections are lost. The faces have no lines, color that might not look 100% natural, and enhanced features that may or may not be realistic.

In many instances, you aren’t quite sure whether you are looking at a face from a video game or a photograph.

The types of websites using this design trend are similar in content, with fashion, art, gaming, portfolio, and AI themes among the most popular.

The true common thread is imagination. This type of design element can’t come to fruition without a strong vision and the ability to see the vision through creation.

These examples use progressively more futuristic variations of the trend:

HueLe Museum: The least AI-looking of the examples, it uses imagery with super-bright light on faces to remove lines and imperfections, so that the models take on almost the look of mannequins.

Jenny Lin: The portfolio design shows the designer in a style representing her work, with a headshot that has an augmented-reality, digital-design feel and an almost plastic-looking, on-the-verge-of-cartoon style.

Ruby9100m: The imagery here is full-on futuristic. From coloring to facial features to an almost Frankenstein-pieced-together look, nothing about this image insists on reality. (Did you notice the blue hand?)


Conclusion

This month’s design trends are a lesson in experimentation and evolution of other visual concepts. They also create an immediate impact on you in terms of emotion because of strong design choices.

Trends like these tend to come and go quickly; nevertheless, it will be interesting to see how they evolve.

Source


The post 3 Essential Design Trends, July 2021 first appeared on Webdesigner Depot.