How to Fix Stocks App Error in iOS12 and macOS Mojave

Original Source: https://www.hongkiat.com/blog/how-to-fix-stocks-app-not-working-ios12-mojave/

A step-by-step guide on fixing the Stocks app error in iOS12 and macOS Mojave.

The post How to Fix Stocks App Error in iOS12 and macOS Mojave appeared first on Hongkiat.

Visit hongkiat.com for full content.

The Future of E-Commerce Platforms

Original Source: https://www.webdesignerdepot.com/2018/10/the-future-of-e-commerce-platforms/

The competition in the e-commerce market is fiercer than ever, as brands wrangle to outdo rivals by deploying the latest techniques and practices technology can offer. However, it’s hard to predict who will lead the industry for long when the future of e-commerce is constantly shifting.

This fast-paced evolution of e-commerce has not only expanded the digital footprint of online brands but also served as an impetus to accelerate the performance of shopping carts and increase revenues for online merchants.

Retailers aren’t the only ones affected. Platform developers are also facing the challenge of meeting the demands of multi-channel users. All these criteria along with the expectation of improved delivery times, customer service and greater product selection will define the future of e-commerce platforms.

1. Personalization & Customer Experience

E-commerce personalization and enhanced user experience will remain the leading fundamentals of e-commerce. Customer purchase decisions will be influenced by a combination of showrooming and webrooming, product demos, and the unique in-store experiences offered by retailers. Hence, e-commerce platforms will continue to evolve to offer the great versatility and depth of choice that comes with online shopping, along with the option of in-store purchasing, collection, or returns.

Creating a perfect profile of your customers is essential to fabricate a hyper-personalized shopping experience. For many years, retailers have relied on “online-to-offline,” or O2O business tactics that include online ads, coupons, and other enticements to nudge customers into the sales funnel. However, as consumers are growing more mindful, the O2O is slowly deteriorating and we’re starting to see what might be described as an “O2O 2.0” approach.

Walmart Inc. stores in China now allow shoppers to pay for their purchases via WeChat, a multi-purpose Chinese messaging, social media, and mobile payment application. Developed by Tencent, WeChat analyzes the data on consumers’ shopping habits and preferences to suggest shopping lists, coupons, and other items.

2. Integration of AI Systems

Rapidly changing and improving technology will define the next big step for the online-retail industry, which is the full automation of the processes across e-commerce platforms. AI systems integrated with e-commerce platforms can run algorithms to determine the optimum conditions for the sales procedure, highest converting design, etc. for every unique online shop. By using algorithms to effectively run tests, optimize settings and repeat the process on loop, retailers can maximize their web store capabilities and yield higher conversions.

It is also anticipated that visual content will play a more important role in buying decisions. While internet giants like Google and eBay have already launched their own versions of visual search—which are still very much in their infancy—retailers like West Elm are also capitalizing on the latest AI technologies to add similar functionality to their stores.

Digital marketing and e-commerce gurus predict that e-commerce platforms will integrate artificial intelligence and machine learning technologies into the shopping experience. This will give retailers more control over the buying process by gathering and storing information about shoppers’ buying habits. In the future, stores will allow shoppers to input their height, weight, complexion, favorite color, and so on, and then suggest clothing purchases based on those inputs. Retailers could use augmented reality to let customers try on clothes virtually and further suggest other clothing items like shoes or trousers to round out a complete outfit.

3. Measurement Across All Devices

Owing to the abundance of available devices, consumers are now actively engaging on multiple devices at once. This means that e-commerce platforms must create solutions that help retailers engage with customers on all fronts. Though many e-commerce platforms like Magento and WooCommerce already provide extensions that can easily create a native mobile app for your store, a seamless mobile checkout is still a challenge for many platforms.

M-commerce has already achieved the trillion-dollar mark and will slowly overtake e-commerce in the near future. Moreover, updates from Google mean that the mobile version of your website will soon become the starting point for what Google includes in its index, and an important ranking factor. So in the future, we anticipate e-commerce platforms will work on providing more innovative mobile-friendly solutions for retailers.

4. The Decline of Monolithic Platforms

Traditional e-commerce platforms are inflexible and do not support features for performing dedicated tasks. To further define the problem, the issues can be broken down into three broad categories:

The first issue is the waste of resources that comes with provisioning a powerful server to handle the load from seasonal shoppers, but which may otherwise remain dormant during the rest of the year.
Secondly, servers in a certain physical location may not be able to provide the performance and speed to customers in another country. This can be a major setback to efforts in converting global customers.
Lastly, housing all required servers in one location makes them more vulnerable to online attacks, server crashes, and numerous other issues, especially if the servers lack a backup. This can lead to major complications, tarnishing brand reputation and causing loss of income.

To enhance customer experience, retailers need to incorporate all sorts of customer analytics into their offerings. The efficiency of this process varies depending on the platform and presents its own challenges.

Custom-built platforms can successfully address these issues, but building one can be a daunting task, requiring a large team of highly experienced developers for development and continuous optimization. Converting a monolithic web application into smaller and simpler services not only increases your website’s efficiency and its ability to scale, but also allows you to react more quickly to change.

Back in 2011, Best Buy broke down its monolithic website into separate web services. This immediately benefited both the company and its customers. (However, this can be an expensive option for small retailers, who more than likely will not be able to rationalize these costs.)

5. Using Hyperscale Computing

Hyperscale computing is not only cost-effective and provides more room for innovation, but it also allows retailers to explore different solutions for individual services. Moreover, retailers will have more freedom in managing expenses and will be freed from the need to make a permanent commitment. Retailers will be able to focus development on areas that highlight their strengths and attract customers in a highly competitive market.

There is no debate that cloud computing has helped e-commerce entrepreneurs save both time and resources. It has opened the world to consumers and online retailers. Walmart spent more than five years and millions of dollars just to build its own internal cloud network. This clearly indicates its determination to grab a bigger slice of online shopping, and it should inspire other online retailers to move to cloud computing quickly, not only to increase their sales but to improve their in-store operations as well.

What the Future Holds

New technologies and the latest products are increasingly changing the manner in which most consumers shop online. Innovative devices, such as Google Home, are decreasing the number of steps required for completing a purchase. Consumers can create a wish list using Google Home and directly place their orders without even launching a web browser or other apps.

Social media channels have also become a big part of the online retail process. They have proven to be effective means of advertising products according to demographics and specific customer behavior. More importantly, customers can use social media channels to gain direct access to the e-commerce platform. The future of these integration tools suggests that soon customers may even be able to make a purchase simply by selecting a product image displayed on a social media channel.

While the complexities of e-commerce continue to increase, retailers are starting to learn that the most elaborate solution may not necessarily be the most effective.

Reducing the e-commerce platform into manageable sections and utilizing consumer data to develop functions that address specific customer behavior are approaches that will set retailers on track for the near future of e-commerce.

 

Featured image via DepositPhotos.

Adobe MAX 2018: news, launches and all the action from the creative conference

Original Source: http://feedproxy.google.com/~r/CreativeBloq/~3/Vw378P2AU1w/adobe-max

Adobe MAX 2018 is nearly here. At this annual gathering of over 12,000 graphic, web and multi-disciplinary designers, art directors, film, video and motion graphics pros and photographers, inspiration is very much the name of the game.

Leading creative conference Adobe MAX offers over 300 sessions, labs and creativity workshops taught by industry leaders to help ignite your imagination and grow your career. This is a place where you can learn about industry trends, see new products and technology in action and kickstart your creativity all under one roof. 

When and where is Adobe Max 2018?

This year's Adobe MAX is being held at the Los Angeles Convention Centre from 15-17 October. If you can make it, we'll see you in (hopefully) sunny LA. But if you're not able to get to the Southern California city, never fear, you can still join in the action via Adobe's live stream. 

And action aplenty is pretty much guaranteed. Adobe always pulls out all the stops for its annual conference, and Adobe MAX 2018 is no exception. Among the keynote speakers you'll find legendary producer and director Ron Howard, DJ and musician Questlove, comic book artist Nicola Scott, YouTuber Lilly Singh, and photographer Albert Watson. 

Adobe MAX attendees will also be invited to witness all the latest and greatest updates to Adobe's Creative Cloud suite, as well as learn some techniques to create amazing content. The learning sessions range from lectures to hands-on demonstrations in everything from photography and videography to prototyping and character design – there really is something for everyone! 

What we already know about Adobe Max 2018

sneaks Adobe Max 2017

Last year’s Sneaks session revealed a number of new innovations, and this year is sure to be no different

For the past few weeks, Adobe has teased a number of sneak previews of the updates made to some of its most popular creative apps. These included details of a supercharged Content-Aware Fill tool in Photoshop and an exciting new toolbar feature in Illustrator. 

These are no doubt the tip of the iceberg, with Adobe sure to wow audiences at Adobe MAX 2018 with even more handy new features and updates to help creatives realise their full artistic potential. While we currently don't know what these might be, we're super-excited to be reporting live from the event, so stay tuned for all the top Adobe MAX news. Don't forget to follow us on Twitter, Facebook and Instagram for the latest updates, and keep checking this page for all the latest news. 

Read more:

7 insane tech sneaks from Adobe MAX
Adobe XD and Adobe Dimension launched at Adobe MAX
How to use Adobe Capture CC

CSS Debugging and Optimization: Code-quality Tools

Original Source: https://www.sitepoint.com/css-debugging-and-optimization-code-quality-tools/

The following introduction to CSS code-quality tools is an extract from Tiffany’s upcoming book, CSS Master, 2nd Edition, which will be available shortly.

On your road to becoming a CSS master, you’ll need to know how to troubleshoot and optimize your CSS. How do you diagnose and fix rendering problems? How do you ensure that your CSS creates no performance lags for end users? And how do you ensure code quality?

Knowing which tools to use will help you ensure that your front end works well.

In this article, we’ll discuss tools that help you analyze the quality of your CSS. We’ll focus on two:

stylelint
UnCSS

stylelint is a linting tool. A linter is an application that checks code for potential trouble spots, enforcing coding conventions such as spaces instead of tabs for indentation. stylelint can find problems such as duplicate selectors, invalid rules, or unnecessary specificity. These have the greatest impact on CSS maintainability.

UnCSS, on the other hand, checks your CSS for unused selectors and style rules. It parses a stylesheet and a list of HTML pages, returning a CSS file that’s stripped of unused rules.

Both of these tools use Node.js and can be installed using npm.
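
For example, here’s a minimal sketch of how UnCSS might be driven from a Node script once installed; the file paths and options shown are placeholders, not prescriptions:

// npm install uncss
const uncss = require('uncss')

// Pages to scan for the selectors that are actually used
const htmlFiles = ['index.html', 'about.html']

// Stylesheet(s) to strip of unused rules
const options = { stylesheets: ['css/styles.css'] }

uncss(htmlFiles, options, (error, output) => {
  if (error) throw error
  console.log(output) // the cleaned CSS
})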

If you’re working on a small site, such as a personal blog or a few pages that are updated infrequently, many of the problems that these tools flag can safely be ignored. You’ll spend time refactoring for little gain in maintainability and speed. For larger projects, however, they’re invaluable. They’ll help you head off maintainability problems before they start.

stylelint

stylelint helps you avoid errors and enforce conventions in your styles. It has more than 160 error-catching rules and allows you to create your own as well via plugins.

stylelint Installation and Configuration

Install stylelint as you would any other npm package:

npm install -g stylelint

Once it’s installed, we’ll need to configure stylelint before using it. stylelint doesn’t ship with a default configuration file. Instead, we need to create one. Create a .stylelintrc file in your project directory. This file will contain our configuration rules, which can use JSON (JavaScript Object Notation) or YAML (YAML Ain’t Markup Language) syntax. Examples in this section use JSON.

Our .stylelintrc file must contain an object that has a rules property. The value of rules will itself be an object containing a set of stylelint rules and their values:

{
  "rules": {}
}

If, for example, we wanted to banish !important from declarations, we could set the declaration-no-important rule to true:

{
  "rules": {
    "declaration-no-important": true
  }
}

stylelint supports over 150 rules that check for syntax errors, indentation and line-break consistency, invalid rules, and selector specificity. You’ll find a complete list of rules and their available values in the stylelint User Guide.

Starting with a Base stylelint Configuration

You’ll probably find it easier to start with a base configuration and then customize it to your project needs. The stylelint-config-recommended base configuration is a good starting configuration. It enables all of the “possible errors” rules. Install it using npm:

npm install -g stylelint-config-recommended

Then, in your project directory, create a .stylelintrc file that contains the following lines:

{
  "extends": "/absolute/path/to/stylelint-config-recommended"
}

Replace /absolute/path/to/ with the directory to which stylelint-config-recommended was installed. Global npm packages can usually be found in the %AppData%\npm\node_modules directory on Windows 10 systems, and in /usr/local/lib/node_modules on Unix/Linux and macOS systems. Type npm list -g to locate your global node_modules directory.

We can then augment our configuration by adding a rules property. For example, to disallow vendor prefixes, our .stylelintrc file would look similar to what’s below:

{
  "extends": "/absolute/path/to/stylelint-config-recommended",
  "rules": {
    "value-no-vendor-prefix": true
  }
}

What if we wanted to limit the maximum specificity of our selectors to 0,2,0? That would permit selectors such as .sidebar .title but not #footer_nav. We can do this by adding a selector-max-specificity rule to our configuration:

{
  "extends": "/absolute/path/to/stylelint-config-recommended",
  "rules": {
    "value-no-vendor-prefix": true,
    "selector-max-specificity": "0,2,0"
  }
}
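
With a configuration in place, you can run stylelint against your stylesheets from the command line (for example, stylelint "css/**/*.css") or from a Node script. Here’s a minimal sketch using stylelint’s Node API; the glob pattern is a placeholder for your own files:

const stylelint = require('stylelint')

stylelint.lint({
  files: 'css/**/*.css',  // stylesheets to check
  formatter: 'string'     // produce a human-readable report
}).then(result => {
  console.log(result.output)                  // formatted warnings, if any
  process.exitCode = result.errored ? 1 : 0   // non-zero exit when a rule fails
}).catch(error => {
  console.error(error)
})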

The post CSS Debugging and Optimization: Code-quality Tools appeared first on SitePoint.

10+ Social Media Tools To Help You Publish At The Right Time

Original Source: https://www.hongkiat.com/blog/when-post-social-media/

A list of useful tools to manage social networks by scheduling posts and analyzing performances etc.

The post 10+ Social Media Tools To Help You Publish At The Right Time appeared first on Hongkiat.

Visit hongkiat.com for full content.

Can this font improve your memory?

Original Source: http://feedproxy.google.com/~r/CreativeBloq/~3/RBwS7SfLz1w/can-this-font-improve-your-memory

Ever read a passage of text only to realise that you didn't take it in? It's a problem that affects a lot of readers, especially students cramming in exam season. To give text more traction and make it easier to remember, Melbourne-based researchers at RMIT University have created a fun font that makes reading harder.

The appropriately named Sans Forgetica does this by taking advantage of "desirable difficulty". This line of thought argues that a small obstruction aids the learning process by forcing a person to create a memory trace.

To make reading legible but also difficult, the Sans Forgetica typeface has been riddled with gaps and given a seven degree back slant. The result is a jarring font that requires an extra bit of effort on the reader's part. It only takes a fraction of a second longer to read, but Sans Forgetica already appears to be making a difference.

Stephen Banham sitting at a desk with samples of Sans Forgetica

Sans Forgetica co-creator Stephen Banham advises you read the typeface in small bursts

As part of a study by the university, students noticed a small increase in memory retention when reading text in Sans Forgetica compared to Arial. The 400 participants were found to remember 57 per cent of Sans Forgetica text, and only 50 per cent when reading Arial. 

Despite the promising statistics, Sans Forgetica has limitations. Typography lecturer and Sans Forgetica co-creator Stephen Banham told The Guardian that the typeface is best suited to short passages. "You wouldn’t want novels printed in it, it would probably induce a headache."

Sans Forgetica took six months to develop and went through three different iterations. With a promising study behind it, it's hoped that the typeface could also be used to aid proofreading. You can download this free font here.

Related articles:

Take a look at the world's most rubbish font
Famous logos redesigned as fonts
How to use web fonts

Build a Simple API Service with Express and GraphQL

Original Source: https://www.sitepoint.com/build-a-simple-api-service-with-express-and-graphql/

This article was originally published on the Okta developer blog. Thank you for supporting the partners who make SitePoint possible.

GraphQL has become an immensely popular alternative to REST APIs. The flexibility you get from using GraphQL makes it easier for developers to get any information they need for an app, and just the information they need for that portion of the app. That gives you the feel of a very customized API and can help cut down on bandwidth.

In this tutorial, I’ll show you how to write a custom GraphQL API using Node and Express. I’ll also show you how to secure parts of the API while making other parts open to the public.

Create the GraphQL API with Express

To create the API, start by creating a new folder and creating a package.json file to manage your dependencies. You’ll also need to install a few dependencies to get GraphQL with Express up and running:

mkdir graphql-express
cd graphql-express
npm init -y
npm install express@4.16.3 express-graphql@0.6.12 graphql@14.0.2 graphql-tag@2.9.2 cors@2.8.4

Now create a file named index.js. This will be your main entry point:

const express = require('express')
const cors = require('cors')
const graphqlHTTP = require('express-graphql')
const gql = require('graphql-tag')
const { buildASTSchema } = require('graphql')

const app = express()
app.use(cors())

const schema = buildASTSchema(gql`
  type Query {
    hello: String
  }
`)

const rootValue = {
  hello: () => 'Hello, world'
}

app.use('/graphql', graphqlHTTP({ schema, rootValue }))

const port = process.env.PORT || 4000
app.listen(port)
console.log(`Running a GraphQL API server at localhost:${port}/graphql`)

This is about as simple as a GraphQL server gets. All this does is return “Hello, world” when you query “hello”, but it’s a start. To take it for a test spin, run node ., then in another tab open your browser to the GraphQL Playground. Once there, enter http://localhost:4000/graphql to access your GraphQL server.

graphql playground - enter endpoint url

The GraphQL Playground will help explore your schema and test out queries. It even automatically creates some documentation for you.

graphql playground - hello world schema

Try querying for hello using the following query:

query {
hello
}

hello world

Improve Your GraphQL Developer Experience

Here are a couple quick tips to help make your development experience a little better:

1. Install a linter to help catch bugs in your editor. This will help keep your styling consistent and catch any easily-avoidable bugs.

To install StandardJS, type npm install --save-dev standard@12.0.1. Most editors will be able to show you warnings and errors as you type.

You can also edit the scripts object of your package.json so that you can run the linter at any time with npm test:

"scripts": {
  "test": "standard"
},

2. Automatically restart the server when you make changes.

Install nodemon with npm install --save-dev nodemon@1.18.4.

Add another script to package.json, so you can run the server with npm start. Combined with the above, your scripts object should look like this:

"scripts": {
  "test": "standard",
  "start": "nodemon ."
},

Go ahead and close the server you had run with node . and now type npm start to restart the development server. From now on, any changes you make will automatically restart the server.

Create the GraphQL Queries

To get something a little more useful, let’s make a post editor. GraphQL is strongly typed, allowing you to create a type for each object and connect them. A common scenario might be to have a post with some text, that was written by a person. Update your schema to include these types. You can also update your Query type to utilize these new types.

type Query {
  posts: [Post]
  post(id: ID): Post
  authors: [Person]
  author(id: ID): Person
}

type Post {
  id: ID
  author: Person
  body: String
}

type Person {
  id: ID
  posts: [Post]
  firstName: String
  lastName: String
}

Even though the resolvers aren’t set up, you can already go back to GraphQL Playground and refresh the schema by clicking the circular arrow icon next to the localhost URL.

url and refresh button

The schema explorer is really useful for figuring out how to create your query. Click the green SCHEMA button to check out your new schema.

full query schema

You’ll need some way to store the data. To keep it simple, use JavaScript’s Map object for in-memory storage. You can also create some classes that will help connect the data from one object to another.

const PEOPLE = new Map()
const POSTS = new Map()

class Post {
  constructor (data) { Object.assign(this, data) }
  get author () {
    return PEOPLE.get(this.authorId)
  }
}

class Person {
  constructor (data) { Object.assign(this, data) }
  get posts () {
    return [...POSTS.values()].filter(post => post.authorId === this.id)
  }
}

Now if you have an instance of a Person, you can find all of their posts by simply asking for person.posts. Since GraphQL lets you only ask for the data you want, the posts getter will never get called unless you ask for it, which could speed up the query if that’s an expensive operation.

You’ll also need to update your resolvers (the functions in rootValue) in order to accommodate these new types.

const rootValue = {
  posts: () => POSTS.values(),
  post: ({ id }) => POSTS.get(id),
  authors: () => PEOPLE.values(),
  author: ({ id }) => PEOPLE.get(id)
}

This is great, but there’s no data yet. For now, stub in some fake data. You can add this function and the call to it right after the assignment to rootValue.

const initializeData = () => {
  const fakePeople = [
    { id: '1', firstName: 'John', lastName: 'Doe' },
    { id: '2', firstName: 'Jane', lastName: 'Doe' }
  ]

  fakePeople.forEach(person => PEOPLE.set(person.id, new Person(person)))

  const fakePosts = [
    { id: '1', authorId: '1', body: 'Hello world' },
    { id: '2', authorId: '2', body: 'Hi, planet!' }
  ]

  fakePosts.forEach(post => POSTS.set(post.id, new Post(post)))
}

initializeData()

Now that you have your queries all set up and some data stubbed in, go back to GraphQL Playground and play around a bit. Try getting all the posts, or get all the authors and posts associated with each one.
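
The Playground is the quickest way to experiment, but since the endpoint is just an HTTP server, the same queries can also be sent from code as a POST request with a JSON body. Here’s a rough sketch; note that node-fetch isn’t one of the dependencies installed earlier, so treat it as an illustrative extra:

// npm install node-fetch (illustrative only; not part of this tutorial's setup)
const fetch = require('node-fetch')

const query = `
  query {
    authors {
      firstName
      lastName
      posts {
        body
      }
    }
  }
`

fetch('http://localhost:4000/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query })
})
  .then(res => res.json())
  .then(({ data }) => console.log(JSON.stringify(data, null, 2)))
  .catch(console.error)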

query all posts

Or get weird and get a single post by id, then the author for that post, and all of that author’s posts (including the one you just queried).

get weird

Add User Authentication to Your Express + GraphQL API

One simple way to add authentication to your project is with Okta. Okta is a cloud service that allows developers to create, edit, and securely store user accounts and user account data, and connect them with one or multiple applications. If you don’t already have one, sign up for a forever-free developer account.

You’re going to need to save some information to use in the app. Create a new file named .env. In it, enter your organization URL.

HOST_URL=http://localhost:4000
OKTA_ORG_URL=https://{yourOktaOrgUrl}

You will also need a random string to use as an App Secret for sessions. You can generate this with the following command:

echo “APP_SECRET=`openssl rand -base64 32`” >> .env

Next, log in to your developer console, navigate to Applications, then click Add Application. Select Web, then click Next.

create new application settings

The page you come to after creating an application has some more information you need to save to your .env file. Copy in the client ID and client secret.

OKTA_CLIENT_ID={yourClientId}
OKTA_CLIENT_SECRET={yourClientSecret}

The last piece of information you need from Okta is an API token. In your developer console, navigate to API -> Tokens, then click on Create Token. You can have many tokens, so just give this one a name that reminds you what it’s for, like “GraphQL Express”. You’ll be given a token that you can only see right now. If you lose the token, you’ll have to create another one. Add this to .env also.

OKTA_TOKEN={yourOktaAPIToken}

Create a new file named okta.js. This is where you’ll create some utility functions, as well as get the app initialized for Okta. Your app will authenticate users through Okta, which issues a JWT access token. You can use this token to determine who a user is. To avoid dealing directly with authentication in your app, a user signs in on Okta’s servers, which then send you a JWT that you can verify.

okta.js

const session = require('express-session')

const OktaJwtVerifier = require('@okta/jwt-verifier')
const verifier = new OktaJwtVerifier({
  clientId: process.env.OKTA_CLIENT_ID,
  issuer: `${process.env.OKTA_ORG_URL}/oauth2/default`
})

const { Client } = require('@okta/okta-sdk-nodejs')
const client = new Client({
  orgUrl: process.env.OKTA_ORG_URL,
  token: process.env.OKTA_TOKEN
})

const { ExpressOIDC } = require('@okta/oidc-middleware')
const oidc = new ExpressOIDC({
  issuer: `${process.env.OKTA_ORG_URL}/oauth2/default`,
  client_id: process.env.OKTA_CLIENT_ID,
  client_secret: process.env.OKTA_CLIENT_SECRET,
  redirect_uri: `${process.env.HOST_URL}/authorization-code/callback`,
  scope: 'openid profile'
})

const initializeApp = (app) => {
  app.use(session({
    secret: process.env.APP_SECRET,
    resave: true,
    saveUninitialized: false
  }))
  app.use(oidc.router)
  app.use('/access-token', oidc.ensureAuthenticated(), async (req, res, next) => {
    res.send(req.userContext.tokens.access_token)
  })
}

module.exports = { client, verifier, initializeApp }

The initializeApp function adds some middleware to allow you to log in with Okta. Whenever you go to http://localhost:4000/access-token, it will first check that you’re logged in. If you aren’t, it will send you to Okta’s servers to authenticate. Once authentication is successful, it returns you to the /access-token route and will print out your current access token, which will be valid for about an hour.

The client that you’re exporting allows you to run some administrative calls on your server. You’ll be using it later to get more information about a user based on their ID.

The verifier is what you use to verify that a JWT is valid, and it gives you some basic information about a user, like their user ID and email address.
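
For instance, once okta.js is imported into index.js (as described next), the verifier could be used to turn a request’s bearer token into a user ID. The helper below is a rough sketch, not code from this article; the header format and the uid claim are assumptions:

// Assumes requests carry an "Authorization: Bearer <token>" header
const getUserId = async ({ headers }) => {
  try {
    const token = headers.authorization.trim().split(' ')[1]
    const { claims } = await okta.verifier.verifyAccessToken(token)
    return claims.uid // the Okta user ID carried in the access token
  } catch (error) {
    return null // missing or invalid token: treat the request as anonymous
  }
}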

Now, in index.js, you’ll need to import this file and call the initializeApp function. You also need to use a tool called dotenv that will read your .env file and add the variables to process.env. At the very top of the file, add the following line:

require('dotenv').config({ path: '.env' })

Just after the app.use(cors()) line, add the following:

const okta = require('./okta')
okta.initializeApp(app)

To make this all work, you’ll also need to install a few new dependencies:

npm i @okta/jwt-verifier@0.0.12 @okta/oidc-middleware@1.0.0 @okta/okta-sdk-nodejs@1.2.0 dotenv@6.0.0 express-session@1.15.6

You should now be able to go to http://localhost:4000/access-token to log in and get an access token. If you were just at your developer console, you’ll probably find you’re already logged in. You can log out of your developer console to ensure the flow works properly.

Create GraphQL Mutations

Now it’s time to use real data. There may be some real John and Jane Does out there, but chances are they don’t have an account on your application yet. Next, I’ll show you how to add some mutations that will use your current user to create, edit, or delete a post.

To generate IDs for a post, you can use uuid. Install it with npm install uuid@3.3.2, then add it to index.js with:

const uuid = require('uuid/v4')

That should go near the top of the file, next to the other require statements.

While still in index.js, add the following types to your schema:
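
The exact additions aren’t spelled out here, but as a rough, illustrative sketch, mutation types for creating, editing, and deleting posts might look something like the following (the type, input, and field names are placeholders rather than the article’s own):

type Mutation {
  # Create a new post, or edit one of the current user's posts if an id is supplied
  submitPost(input: PostInput!): Post
  # Delete one of the current user's posts
  deletePost(id: ID!): Boolean
}

input PostInput {
  id: ID
  body: String!
}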

The post Build a Simple API Service with Express and GraphQL appeared first on SitePoint.

The best full-frame cameras in 2018

Original Source: http://feedproxy.google.com/~r/CreativeBloq/~3/vKrBzW2PyXA/best-full-frame-cameras

Full-frame cameras have a lot to offer the creative photographer, and they come in a wide range of shapes and sizes. To help you figure out which is the right model for you, we've rounded up our pick of the best full-frame cameras in a range of different categories. If you're not sure it's a full-frame camera you want, we also have a guide to the best camera for creatives.

So what exactly is a full-frame camera, and why would you want one? Full-frame cameras can deliver a tighter depth of field than models with a crop sensor, which can be a major bonus in portraiture and still life photography. 

The fact that the image sensor has a physically larger surface area can be a key advantage in other ways as well. Manufacturers can cram extra megapixels onto the sensor, increasing the potential for capturing ultra-fine detail and texture. Alternatively, they can stick to a more modest megapixel count and increase the size of the actual photosites, which equate to pixels in the resulting image. Bigger photosites enable the camera to capture more light, which can result in less image noise when shooting at high ISO (sensitivity) settings.

Many photographers still prefer conventional DSLRs, with their reflex mirrors and optical viewfinders. However, there's a growing range of mirrorless 'compact system cameras' on the market, with Sony offering a number of full-frame bodies and companion lenses in its lineup. 

Let's check out the best full-frame cameras on the market to suit your budget and specific requirements.

With the notable exceptions of the 5DS and 5DS R, Canon's highly regarded EOS 5D series of cameras have never set the world alight in terms of megapixel count. True to form, the latest Mk IV weighs in with a 30.4MP image sensor, which turns out to be a very good compromise. It enables the camera to capture fine detail extremely well, while also maintaining very clean image quality at high ISO settings, along with a fairly fast 7fps maximum drive rate. The autofocus systems are excellent, with a 61-point phase-detection module for shooting stills through the viewfinder, and Dual Pixel AF for live view and movie capture, the latter of which is available at 4K UHD.

Sony's latest flagship mirrorless camera packs a full-frame sensor and dual memory card slots into a typically small and lightweight package. The sensor itself might look unimpressive, with a 24.2 megapixel count, but it's a stacked CMOS device with onboard processing and memory. 

Advantages include low-noise image quality at very high ISO settings, and blistering continuous drive speeds of up to 20fps, complete with autofocus tracking. An electronic shutter is also on hand, to enable shutter speeds of up to 1/32000th of a second, so you can freeze even the fastest action. The electronic viewfinder is absolutely outstanding and the rear touchscreen is nice and clear, although it only has a tilt facility and lacks full articulation.

For outright resolving power, the 45.4MP Nikon D850 clearly wins out against the 30.4MP Canon 5D Mk IV. And despite having 50 per cent more megapixels, it matches the Canon for maximum drive rate, at 7fps. The rear screen is also ultra-high-res, and very easy on the eye. As a pro-grade Nikon, it has a substantially different control layout to consumer-grade cameras like the D750. It's more like a scaled-down Nikon D5, without the built-in vertical grip. As such, it's reasonably small and lightweight for a pro-grade DSLR. 

The only real downside is that, for shooting under low lighting conditions at high ISO settings, image noise can be rather noticeable, especially compared with the likes of the Canon 5D Mk IV and the super-smooth Nikon D750.

This is our pick for the best full-frame budget camera on the market. It took six years for the Mark II edition of Canon's 'enthusiast' level full-frame DSLR to topple the original 6D from its throne. It's been well worth the wait, as the main autofocus system gets a mighty upgrade from 11 AF points with only a single cross-type point, to 45 AF points, all of which are cross-type for greater accuracy. 

The sensor-based autofocus system for live view and movie capture gets an even bigger upgrade, with a dual pixel AF sensor that makes focusing massively faster. The maximum drive rate is 2fps faster at 6.5fps, and the new model features 5-axis stabilisation for movie capture. However, this isn't available for shooting stills, and movies themselves are limited to 1080p rather than 4K. Even so, the excellent fully articulated touchscreen will benefit those shooting movies as well as live view stills.

What you see is what you get with this camera. The immensely detailed and super-sharp electronic viewfinder has crystal clarity, reflected in the ultra-high definition stills that are captured by the 42.4MP image sensor. 4K UHD movie capture is just as much of a treat, as the A9 delivers wonderfully sharp and detailed results, helped along by its 5-axis image stabiliser. Overall 4K movie quality beats that of any regular DSLR currently on the market, and you can boost resolution to 5K in 'Super 35mm' mode. Advanced functions to suit serious videographers include a clean HDMI output, zebra display, time code and slow/quick motion, to name but a few.

Costing two-thirds of the price of the A7R III and little more than half the price of the A9, the A7 III is the most 'sensible' option for those hunting for the best full-frame camera for travel. There's no shortage of advanced features, including a back-illuminated image sensor that enables very clean high-ISO images (more so than in the A7R III), a fabulously fast and reliable hybrid autofocus system, speedy 10fps continuous stills shooting, and 4K video capture. 

With its small, lightweight build, it's eminently suitable for travel photography and, while the A9 and A7R III are also very travel-friendly, the A7 III edges ahead in terms of battery life, with up to 610 or 710 shots per charge, using the viewfinder or rear screen respectively. If you're going to be hitting the beach or engaging in adventurous activities on your travels, it's also nice not to be packing quite such an expensive camera.

If you're after the best full-frame camera for sports or wildlife photography, look no further than the Canon EOS-1D X Mark II. Many pros love this DSLR simply for its handling characteristics. With a built-in vertical grip that fully duplicates all the important shooting controls, it feels equally natural to use in portrait or landscape orientation.

The camera really comes into its own for action sports and wildlife photography where, for a DSLR at least, it delivers a super-fast continuous drive rate of 14fps, and as much as 16fps in live view mode. The 61-point autofocus system makes a spectacularly good job of keeping tabs on fast or erratically moving objects, with plentiful tracking options to choose from. The shooting speed is helped by the modest megapixel count of 20.2MP, but this also ensures relatively noise-free image quality when you need to shoot at very high ISO speeds, for example when freezing the wildlife action at twilight, or for indoor sports.

Inspired by classic yesteryear Nikon 35mm stills cameras, the Df will appeal to photographers of a certain age or inclination. It has a plethora of hands-on, dedicated dials up on top, for adjusting shooting parameters like ISO, shutter speed and exposure compensation, as well as the usual shooting buttons and dials on the front and back. Based on the same image sensor and processor as the flagship D4 (which has now been superseded by the D5), the Df is also starting to look a bit retro in terms of its 16.2 megapixel count. An upside is that high-ISO images are fairly noise-free. A major downside for many modern photographers is that Nikon has taken the 'retro' theme to the extreme by stripping out any video capture facility from the camera.

With a similar price tag to the Canon 6D Mark II, the older Nikon D750 almost matches it for megapixel count, with a 24.3MP sensor. The D750 is equally able to capture fine detail and texture but draws slightly ahead in minimising image noise at very high ISO settings. It's far better than the Nikon D850 in this respect, making the D750 a better proposition for shooting indoors or under very low lighting without resorting to flash. This can be a particular plus point for wedding photographers and others needing to shoot indoor events. Another upside for capturing important, unrepeatable events is that, unlike the Canon 6D Mark II, the D750 has dual memory card slots, so you can create instant backups of every shot you take, on separate cards.

With a keen eye for detail, the K-1 Mark II has a 36MP image sensor with no anti-alias filter, and can deploy its 5-axis sensor-shift image stabiliser in a variety of ways. For starters, it can reduce camera-shake in handheld shooting with up to 5-stop efficiency. There are also tripod and handheld modes for shifting pixels between successive shots, to enhance the capture of ultra-fine detail. 

For shooting the night sky, there's a more intriguing 'Astrotracer' mode. This employs the camera's internal GPS module and electronic compass for astrophotography. The latitudinal position on the globe, plus its direction and horizontal/vertical tilt are all measured automatically. Calculations are performed and the image stabiliser shifts the sensor throughout the exposure. This effectively tracks the movement of the moon, stars and other celestial bodies, so that they don't blur or appear to streak through the night sky.


Software Inevitably Changes – WordPress is No Exception

Original Source: http://feedproxy.google.com/~r/1stwebdesigner/~3/dlgaEx9F8qI/

For those of us who work on the web, the tools we use are incredibly important. We tend to get attached to them. Some of us even go out of our way to promote a particularly good one.

Over the past 15 years, WordPress has been a tool that perhaps benefitted from this loyalty like no other. A small community of diehard supporters has turned into a massive one. There’s a marketplace for themes and plugins. There are numerous users who volunteer their time in capacities official and unofficial. Today, the WordPress community is a force to be reckoned with.

As WordPress has grown into the CMS of choice for so many, so has the criticism of its continued evolution. And with the new Gutenberg editor, things have reached a fever pitch.

This raises a few important questions. How much weight should critics carry? And, what should we reasonably expect from WordPress in terms of new features? Let’s dive in and see if we can find some answers.

The Sky is Falling…Or Not

The coming rollout of Gutenberg in WordPress 5.0 has garnered a lot of opinions. Since the first steps towards its creation, there has been a mixture of excitement and dread within the community.

Then, as Gutenberg was released as a beta plugin, the stuff really started hitting the fan. While we won’t go over every criticism, suffice it to say that some users expressed concerns of sites breaking due to theme or plugin incompatibilities and a buggy UI. Then there were those who had philosophical objections to parts of, or even the very existence of the project.

There are indeed a number of legitimate concerns. But there has also been an element of what I’ll respectfully describe as fear of the unknown. It too has a place in the conversation. But so often it seems to shout over everything else without adding anything productive.

However, that fear of the unknown should fade over time. As users become more accustomed to a change, it stands to reason that they won’t have nearly as much anxiety.

Personally, this has been my own experience with Gutenberg. The more I use it and the more bug fixes that get released, the more comfortable I am. Not to say that there still aren’t plenty of areas for improvement. But at least I’m starting to see a light at the end of the tunnel.

Still, it seems like there is a lot of emotion out there. We’re seeing a number of 1-star reviews and some developers have even begun to develop their own forks of WordPress. Fair or not, people are attached to the old way of doing things.

Scrabble letters spelling "FEAR"

Change is Natural

What tends to get lost amongst all the hype is that, if software hangs around long enough, it’s going to change over time. WordPress just happens to be at a point where its wide usage is calling more attention to these changes.

Operating systems, for example, are famous for annoying a subset of users with UI and feature changes (I’m looking at you, Microsoft). Not everyone likes to change the way they work, even if the end result really is an improved product. There is something to be said for comfort and predictability. When that’s disrupted, users cringe.

WordPress is reported to power over 30% of the web. So, it makes sense that a major change such as Gutenberg would cause some unrest. That number covers a whole lot of preferences, use cases and opinions.

The trick for any software developer is that they have to balance the greater need of maintaining a viable product with keeping existing users happy. There are no easy answers, and WordPress certainly isn’t immune from having to make these difficult decisions.

Person balanced on railroad tracks

You Can’t Always Get What You Want

That leads us to Gutenberg. There was, whether we agree with it or not, a determination that the Classic Editor was becoming outdated. Eventually, it was decided that a new editor was the best way to address the issue.

Knowing that you can’t please everyone, the preferred course of action is to create the best product you can. Take care to ensure that it works on as many existing websites as possible. Take constructive criticism seriously and make compromises where you can.

WordPress has even taken this a step further. Instead of forcing Gutenberg on everyone, they have also provided an alternative path. The Classic Editor plugin keeps the familiar content editing experience and will be supported for the foreseeable future.

While that may not be the perfect solution to some, it is a way forward for those who don’t want to (or are unable to) change.

Even with that compromise, there will be some users who refuse to come along for the ride. While that’s certainly not what WordPress wants, it is part of the price you pay for implementing major change. You might say that it’s a philosophy of knowing that you’ll lose some users now, with the hopes of making greater gains in the future.

Gutenberg WordPress editor

Gutenberg is Part of a Constant Evolution

I am very much a creature of habit when it comes to how I work. For me, change means that I have to take precious time out of my day to relearn how a tool works. It disrupts my routine. The whole experience is generally not something I would voluntarily seek out.

But I’ve also come to the point of realizing that change is inevitable. And it often pushes things in the right direction. If it didn’t, I’d still be writing HTML by hand and using tables for layout.

What’s interesting about the Classic Editor is that, in an industry that changes so quickly, it has managed to stick around for a very long time. Sure, it’s undergone incremental improvements. But the basic experience has been the same. It’s always familiar and comfortable – even if it occasionally is a pain to work with.

Still, things move forward. Web design is a field that constantly challenges us to adapt to what’s new. For better or worse, Gutenberg is just one more change. We can expect that there will be more to come.


Multibox Menu

Original Source: http://feedproxy.google.com/~r/tympanus/~3/Cc8M9UBRXKU/

Today we’d like to share a simple fullscreen menu with you. The menu layout is powered by CSS Grid and we make its boxes appear with a reveal animation. The animations are made using TweenMax.

MultiboxMenu

The demo is kindly sponsored by Milanote: your new creative hub. If you would like to sponsor one of our demos, find out more here.

Attention: Note that we use modern CSS properties that might not be supported in older browsers.

For the main page we have a simple layout with a little motion effect on the background image.

MultiboxMenu01

When clicking on the menu icon in the top right corner, the menu opens.

MultiboxMenu02

The menu boxes reveal following an order that we define on each element.
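
As a rough idea of how such an ordered reveal can be driven with TweenMax, here’s a small sketch; the selector, the data-order attribute, and the timing values are assumptions for illustration rather than the demo’s actual code:

// Sort the menu boxes by a per-element data-order attribute, then tween them in
const boxes = Array.from(document.querySelectorAll('.menu__box'))
  .sort((a, b) => Number(a.dataset.order) - Number(b.dataset.order))

boxes.forEach((box, i) => {
  TweenMax.to(box, 0.8, {
    delay: i * 0.06,                // stagger according to the defined order
    ease: Expo.easeOut,
    startAt: { y: 60, opacity: 0 }, // start shifted down and transparent
    y: 0,
    opacity: 1
  })
})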

MultiboxMenu

We hope you like this little menu and find it useful!

References and Credits

Images from Unsplash.com
TweenMax by Greensock
Map made with Fantasy Map Generator
imagesLoaded by Dave DeSandro

Multibox Menu was written by Mary Lou and published on Codrops.