Category: CleanTalk

  • Strengthening brute force protection

    We have added new logic to prevent brute force attacks. The service checks your login status once per hour, and if an IP address has made 10 or more login attempts, that IP is banned for the next 24 hours.

    This makes the brute force protection tougher and avoids wasting server resources on these IPs.
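
    As a rough illustration of the logic described above (a simplified sketch only, not CleanTalk's actual code), the hourly check could look like this:

    // Simplified sketch: count login attempts per IP and, once per hour,
    // ban every IP that reached 10 or more attempts for the next 24 hours.
    var loginAttempts = {}   // ip -> attempts during the current hour
    var bannedUntil = {}     // ip -> timestamp (ms) until which the IP is banned

    function recordLoginAttempt (ip) {
      loginAttempts[ip] = (loginAttempts[ip] || 0) + 1
    }

    function isBanned (ip) {
      return bannedUntil[ip] !== undefined && Date.now() < bannedUntil[ip]
    }

    setInterval(function () {
      for (var ip in loginAttempts) {
        if (loginAttempts[ip] >= 10) {
          bannedUntil[ip] = Date.now() + 24 * 60 * 60 * 1000
        }
      }
      loginAttempts = {}     // start a fresh hourly window
    }, 60 * 60 * 1000)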

    Download Security & Firewall by CleanTalk.

  • Compass Pool Centre Newcastle New Website Project 2016

    Old Website Needing to Be Rebuilt

    Compass Pool Centre Newcastle is an authorised Compass Pools Australia dealership based in Newcastle. We deliver high quality fibreglass swimming pools throughout the Newcastle and Hunter region in NSW, Australia. Thanks to hundreds of satisfied customers, we have had a lot of new leads each month. However, to reach more people interested in buying a swimming pool, we needed to improve our online presence.

    The old website had been in use for several years and was rarely updated. As a result, some of the information was outdated and some was missing. The website did not feature enough content, and because of this we could not aim for higher positions on search engine results pages. It needed a visual redesign, structural changes, and technical modernisation to keep pace with current website development trends and industry best practices.

    New Project Starting in April 2016

    In April 2016, the new website project started. Catnapweb.com.au was selected as the contractor for developing the new website. They had worked with Compass Pools Australia on several projects since December 2015 and, thanks to this, they knew the environment and Compass products well and could build the new website, including its content and search engine optimisation.

    In May 2016, the new website was launched on the compassnewcastle.com.au domain. It featured a completely new design, a lot of fresh content, and it was optimised for mobile devices and search engines. Much of the new content is dedicated to precisely describing the pool shapes and parameters unique to Compass pools and the technologies that let customers enjoy a Compass pool without everyday maintenance, and to showing pool installation pictures as inspiration for new customers. There are numerous unique features on the website. For example, each pool shape can be rotated in 3D by the website visitor, enabling them to explore the pool from all sides. Each shape page also features a description of the shape, a table with available sizes, and a photo gallery of selected installations of that shape.

    Unique Content on the Website

    The website features a lot of new content, including the secret weapons that distinguish Compass from other pool manufacturers and an Advice section with useful information for all prospective and current pool owners. One of the innovations implemented on this website is a set of calls to action. Besides the standard contact form, other ways to interact with the company were created and offered to website visitors. One of them is the page introducing the printed version of the Pool Magazine. This magazine was created by Compass Pools Australia as a comprehensive resource containing the best information about pool ownership. The Pool Buyers Guide, as we call the magazine, has 70 pages full of information on selecting the right pool, the cost of buying a pool, pool ownership costs and much more. This offer has been a great way to get in touch with people: because the magazine is sent to everyone for free, many people have requested it since the launch of the new website.

    Another new service was also added to the website. In 2016, Compass Pools Australia introduced a new range of complementary products. These can be added to a standard fibreglass swimming pool to create unique, customised fibreglass pools. This concept has been named Customise your pool, and we introduced it to website visitors on a separate page. Here, they can find information about Maxi Rib technology, beach zones, grand entry benches, pool and spa water combos, and various water features.

    Technical Solution

    The new website runs on WordPress, currently version 4.7. It is built using The7 universal theme, which has probably the most features on today's theme market and is very flexible and versatile. There were several challenges we needed to overcome when building the website. One of them was selecting the best solution for generating the 3D rotations. We already had 3D models of all our pools ready; we then needed to add surface material to them and export them to an image sequence usable for 3D rotation. We spent hours and hours testing different numbers of pictures in the export, their quality and resolution, and dozens of WordPress plugins. Finally, we found that the only suitable solution for us was the Smart Product Viewer plugin. We found a balance between the usability of the rotation and the performance of the website. To optimise the user experience, we did not replace the static 3D pool shape representation located at the top of the page, but added the interactive pool rotation at the bottom. With this solution in place, a user coming to the website does not need to wait until the rotation is loaded, and by the time they reach this part of the page, it is ready for interaction.
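
    The general approach can be sketched like this (an illustration only; this is not the Smart Product Viewer API, and the frame count and URLs are hypothetical): the rotation frames are preloaded in the background once the page has rendered, so the viewer at the bottom of the page is ready by the time the visitor scrolls down to it.

    // Illustrative sketch: preload the image sequence for a 3D rotation after the
    // page has loaded, so the interactive viewer further down is ready on arrival.
    window.addEventListener('load', function () {
      var totalFrames = 36                                    // hypothetical number of exported frames
      var frames = []
      for (var i = 0; i < totalFrames; i++) {
        var img = new Image()
        img.src = '/rotations/pool-shape/frame-' + i + '.jpg' // hypothetical frame URL
        frames.push(img)
      }
      // Once the browser has cached the frames, a viewer script can swap them
      // on drag without any visible loading delay.
    })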

    The website contains several forms that collect information from website visitors. They are built with Gravity Forms and the submissions are stored in the database. Soon after the launch we realised that we were getting spam requests sent through the forms. We discussed possible solutions and wanted to implement a captcha to stop spam robots from filling in the form fields. However, Andrej from Catnapweb.com.au suggested using CleanTalk, which he had already tested on several other websites. This solution, despite being quite new, has many advantages over traditional protection. The main one is that it does not require any action from the website user and thus does not decrease the conversion rate. After setting it up, we monitored the weekly reports from CleanTalk to determine whether any real requests had been blocked. After several months of running CleanTalk, we can happily confirm that the false positive rate is 0 and no real request has been blocked by CleanTalk. It is very impressive and everyone is happy with this service.

    After 7 months of running, we can see that the website has started generating a lot of relevant requests. We have the chance to get in touch with people who need more information about swimming pools, and we are happy to assist them in their pool-buying decision process.

  • 29 Steps to audit your Website with your own hands and Top 7 Useful Website Audit Tools

    29 Steps to audit your Website with your own hands and Top 7 Useful Website Audit Tools

    The list of aspects that must be considered when analyzing the quality of a site has recently expanded significantly. First of all, these include mobile website optimization, regional optimization of the resource, page loading speed and other components.

    In this article, we have tried to collect the factors that you can directly influence on your website, and have not considered external factors such as the acquisition and quality of backlinks, guest posts and promotional articles.

    Who this article is for:

    • for web designers and developers who want to create sites that are optimized for SEO from the start,
    • for website owners who want to understand SEO themselves in order to increase search traffic.

    A well-structured checklist allows you to carry out a thorough website audit, find out which specific factors are negatively affecting performance, and get technical recommendations to apply in further work on the resource.

    SEO audit

    1. Duplicate pages (carefully look for all the duplicates reachable via different URLs and handle them with redirects, by closing them in robots.txt, or with the rel="canonical" attribute; a minimal example of handling duplicates is sketched after this list). Possible duplicates: http://www.site.ru/index.php (index.html, default.asp), trailing slashes, pagination pages, session variables in the URL, search filters, tags of contextual and other advertising, etc. Other variants of duplicates: http:// and https://, /cat/dir/ and /dir/cat/, site.com and test.site.com, /session_id=123 and so on.
    2. Check for blank pages (pages that contain no content). Blank pages can be:
    • removed,
    • closed from indexing (in the robots.txt file, see below),
    • filled with content.
    3. "Endless pages": pages where a parameter can be changed to any other number, allowing an infinite number of duplicates of any page to be created (this often happens with calendars, pagination, products, etc.).
    4. Robots.txt (robots.txt should specify the host and the sitemap and close all service sections, print versions of pages, the admin panel, test domains, URLs left over from a previous website management system, etc.). It is also often recommended to close: user profiles, the "create new topic" page on forums, the login page, personal accounts, personal message pages, search result pages, and the shopping cart of an online store.
    5. Check the redirects on the website (remove any that are superfluous). Types of redirects:
    • 301 — the requested document has permanently moved to a new URL,
    • 302 — the requested document is temporarily available at another URL.
    6. XML sitemap. The sitemap should contain no service pages and no pages banned by robots.txt or the robots meta tag.
    7. Check that the encoding is specified in the code.
    8. Check the presence and uniqueness of the title, description and keywords tags on every page (all these elements must be different for each page). Are there any missing titles, meta descriptions or keywords? How attractive are the titles and descriptions for clicks?
    9. Try to include all the keywords in the title tag; the most popular keyword should be closer to the beginning of the tag.
    10. Use the "|" symbol to separate different phrases in the title tag (for example, the name of the page and the site).
    11. The description tag does not affect the ranking of a website directly, but search engines may use its content for the website's snippet in the results. The description length can be from 100 to 250 characters, ideally 155 characters. It is usually one or two meaningful sentences describing the page, including search queries.
    12. Analyse whether the images on the website are optimized. You can learn how to optimize images in the Google help section.
    13. Bounce rate (viewing only one page): make a list of the pages with many bounces and improve them.
    14. What are the main entry and exit points of the site? (It is better to study the content and usability of the main entry points, and to analyse where visitors go from them and why they leave.)
    15. How many pages does the average user view? (Consider adding interesting blocks to the page, such as "see also", etc.)
    16. Add a favicon to the site.
    17. Styles and scripts should be loaded in the head as separate files.
    18. A page can have only one h1 heading. The h1 heading should not copy the title. The title was created to describe the page, not to be part of the page content: the title describes the whole page, while h1 describes only its content. They carry different meanings, which may coincide.
    19. The alt and title attributes of images should be different. Alt is the alternative text shown if the image is not loaded. Title is the title of the picture that pops up when you hover over it and is used in search.
    20. Use HTTPS. Google representatives say that moving a site to the HTTPS protocol, with a 2048-bit SSL key, improves its position on search results pages. Google advises web developers to test their sites with HTTPS using the Qualys Lab tool.
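
    As a minimal illustration of item 1 (the URLs and the Express framework here are only an example, not a prescription), a server can collapse a duplicate entry point with a 301 redirect and declare the preferred address with rel="canonical":

    var express = require('express')
    var app = express()

    // 301-redirect a known duplicate entry point to the preferred URL.
    app.get('/index.php', (req, res) => res.redirect(301, '/'))

    // Declare the canonical address so that search engines treat duplicates
    // (pagination parameters, session variables, etc.) as one page.
    app.get('/', (req, res) => {
      res.send(`<html><head>
        <link rel="canonical" href="http://www.site.ru/">
      </head><body><h1>Home page</h1></body></html>`)
    })

    app.listen(1234)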

    Technical audit of the website

    21. Check the website loading speed. This is one of the important factors that affect the ranking of a website in search engines. You can check it using Google PageSpeed or Google Search Console.

    Check which items take the longest to load and consider options for optimization:

    https://tools.pingdom.com/
    https://gtmetrix.com/

    22. Check your website on mobile devices. This can be done in Google Search Console.
    23. Check for errors in Google Search Console -> Crawl -> Crawl Errors.
    24. For some sites it makes sense to run load testing with the service http://loadimpact.com/ (the free test shows how the hosting and the site behave with up to 50 visitors at a time).
    25. Using a gzip test tool, make sure that gzip compression is enabled on the server for the website (a minimal check is sketched after this list).
    26. Check the website for viruses:
    • Norton Safe Web, from Symantec — a reputation service from Symantec: its servers analyze websites to see how they will affect you and your computer, so you can find out whether a website is a safety risk before you visit it.
    • AVG Online Web Page Scanner — lets you check the safety of individual web pages you are about to visit. LinkScanner examines the web page in real time to see whether it is hiding any suspicious downloads.
    27. Check your web pages for broken links:
      https://validator.w3.org/checklink
      http://www.siteliner.com/ — finds duplicate content, broken links and more. The free scan is limited to 250 pages.
    28. Check your Cascading Style Sheets (CSS):
      http://jigsaw.w3.org/css-validator/
    29. Check your HTML for errors or warnings:
      https://validator.w3.org/nu/
      http://htmlhelp.com/tools/validator/
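
    A quick way to check item 25 (a small sketch; www.example.com is just a placeholder host) is to request a page while advertising gzip support and look at the Content-Encoding of the response:

    var https = require('https')

    // Ask for a gzip-compressed response and report what the server actually sent.
    https.get({
      host: 'www.example.com',                 // placeholder: put the audited site here
      path: '/',
      headers: { 'Accept-Encoding': 'gzip' }
    }, (res) => {
      console.log('Content-Encoding:', res.headers['content-encoding'] || 'none')
      res.resume()                             // discard the body
    })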

    Powerful Website Audit Tools You Should Check

    SEO Site Checkup
    SEO Site Checkup is a free analysis tool that audits the entire website with 45 checks in 6 different categories (common SEO issues, server & security, mobile usability, social media and semantic web). Based on these results, the tool shows a general score and the number of failed checks.

    Seoptimer
    Seoptimer is a free auditing tool that instantly reports critical errors on your website in seconds and recommends what you should do to improve your search rankings.

    SiteAnalyzer
    The tool tests your website's effectiveness based on 50 parameters, instantly identifies problem areas to fix and shows all technical SEO mistakes.

    SE Ranking Website Audit
    It is useful not only for identifying website errors at a glance, but also for preparing a list of tasks for website developers, content writers and web designers to prevent technical issues. It crawls your entire website based on over 70 parameters such as domain overview, pages, meta, content, images and links analysis, mobile optimization, usability and technologies.

    Moz Crawl Test
    Moz runs its own site crawler that helps webmasters to check out critical issues, HTTP status codes and other useful data. It also figures out duplicate content, errors in the title tag, server redirects and many other factors that can affect website crawlability.

    SimilarWeb
    With its SimilarWeb platform, the company uses big data technology to collect, measure, analyze and provide data on behavioral patterns and user engagement statistics for websites and mobile applications. A similar tool, Alexa, collects statistics from users who have installed a special toolbar. In the case of SimilarWeb, the data comes from its own crawler plus data from millions of users with a special toolbar. It is possible to see the list of other websites that your visitors have visited along with your site. This allows you to learn more about the interests of your audience and stay one step ahead of competitors.

    About the CleanTalk service

    CleanTalk is a cloud service to protect websites from spambots. CleanTalk uses protection methods that are invisible to the visitors of the website. This allows you to abandon the methods of protection that require the user to prove that he is a human (captcha, question-answer etc.).

  • Feature update for spam comment management in WordPress

    Feature update for spam comment management in WordPress

    We have launched an update to the spam comment management options.

    The new option “Smart spam comments filter” divides all spam comments into Automated Spam or Manual Spam.

    For each comment, the service calculates the probability that the spam comment was sent automatically or by a human.

    All automated spam comments will be deleted permanently without reaching the WordPress backend, except for comments with Stop-Words. Stop-Word comments will always be stored in the "Pending" folder. Both blocked and banned comments can be seen in the Anti-Spam Log.

    To manage the actions taken on spam comments, go to the Control Panel, select the website you want to change the actions for and go to "Settings" under the name of the website. On the website settings page, select the desired item from the "SPAM comment action" menu and click the "Save" button at the bottom of the page.

  • New features for spam comments management on WordPress

    New features for spam comments management on WordPress

    For WordPress users of the service, we have added new options to manage spam comments.
    By default, all spam comments are placed in the spam folder; now you can change the way the plugin deals with spam comments:

    1. Move to the Spam folder. To keep the spam folder from growing uncontrollably, it can be cleaned automatically using the option "Keep spam comments for 15 days". Enable this option in the plugin settings: WP Dashboard-Settings-Anti-Spam by CleanTalk->

    2. Move to Trash. All spam comments will be placed in the "Trash" folder in the WordPress Comments section, except comments with Stop-Words. Stop-Word comments will always be stored in the "Pending" folder.

    3. Ban comments without moving them to the WordPress backend. All spam comments will be deleted permanently without reaching the WordPress backend, except comments with Stop-Words. Stop-Word comments will always be stored in the "Pending" folder. Which comments were blocked and banned can be seen in the Anti-Spam Log.

    To manage the actions taken on spam comments, go to the Control Panel, select the website you want to change the actions for and go to "Settings" under the name of the website. On the website settings page, select the necessary item from the "SPAM comment action" menu and click the "Save" button at the bottom of the page.

  • Exotic HTTP headers

    Hello! This article will illustrate the result of applying some important and exotic HTTP headers, most of which are related to security.

    X-XSS-Protection

    XSS (cross-site scripting) is a type of attack in which malicious code can be embedded in the target page.
    For example, like this:

    <h1>Hello, <script>alert('hacked')</script></h1>

    This type of attack is easy to detect, and the browser can handle it: if the source code contains part of the request, it may be a threat.

    The X-XSS-Protection header controls this behavior of the browser.

    Accepted values:

    • 0 — the filter is turned off.
    • 1 — the filter is enabled. If an attack is detected, the browser will remove the malicious code.
    • 1; mode=block — the filter is enabled, but if an attack is detected, the browser will not load the page at all.
    • 1; report=http://domain/url — the filter is enabled, and the browser will clean the page of malicious code while reporting the attempted attack. This uses a Chromium function for reporting content security policy (CSP) violations to the specified address.

    Let's create a web server sandbox in node.js to see how it works.

    
    var express = require('express')
    var app = express()

    app.use((req, res) => {
      if (req.query.xss) res.setHeader('X-XSS-Protection', req.query.xss)
      res.send(`<h1>Hello, ${req.query.user || 'anonymous'}</h1>`)
    })

    app.listen(1234)
    
    

    I will use Google Chrome 55.

    No header
    http://localhost:1234/?user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E

    Nothing happens: the browser successfully blocks the attack. Chrome blocks the threat by default and reports it to the console.

    It even highlights the problem area in the source code.

    X-XSS-Protection: 0

    http://localhost:1234/?user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E&xss=0

    Oh no!

    X-XSS-Protection: 1

    http://localhost:1234/?user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E&xss=1

    The page was cleaned because of the explicit header.

    X-XSS-Protection: 1; mode=block

    http://localhost:1234/?user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E&xss=1;%20mode=block

    In this case, the attack will be prevented by blocking the page load.

    X-XSS-Protection: 1; report=http://localhost:1234/report

    http://localhost:1234/?user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E&xss=1;%20report=http://localhost:1234/report

    The attack is prevented and a message is sent to the appropriate address.

    X-Frame-Options

    With this header you can protect yourself from so-called clickjacking.

    Imagine that an attacker has a channel on YouTube and wants more followers.

    He can create a page with a "Do not press" button, which of course means that everyone will click on it. But a completely transparent iframe sits over the button, and this frame hides the channel page with the subscribe button. So when you press the button, the user actually subscribes to the channel, provided, of course, that he was logged into YouTube.

    We will demonstrate that.

    First, you need to install the extension to ignore this header.

    Create a simple page.

    
    <style>
    button { background: red; color: white; padding: 10px 20px; border: none; cursor: pointer; }
    iframe { opacity: 0.8; z-index: 1; position: absolute; top: -570px; left: -80px; width: 500px; height: 650px; }</style>
    
    <button>Do not click this button!</button>
    <iframe src="https://youtu.be/dQw4w9WgXcQ?t=3m33s"></iframe>
    

    As you can see, I have placed the frame with the subscription right over the button (z-index: 1), so if you try to click the button, you actually click the frame. In this example the frame is not fully transparent, but that can be fixed with opacity: 0.

    In practice this doesn't work, because YouTube sets the necessary header, but I hope the nature of the threat is clear.

    To prevent a page from being used in a frame, you need to use the X-Frame-Options header.

    Accepted values:

    • deny — do not load the page in a frame at all.
    • sameorigin — do not load the page in a frame if the source is not the same.
    • allow-from: DOMAIN — you can specify a domain from which the page may be loaded in a frame.

    We need a web server to demonstrate this:

    var express = require('express')

    for (let port of [1234, 4321]) {
      var app = express()
      app.use('/iframe', (req, res) => res.send(`<h1>iframe</h1><iframe src="//localhost:1234?h=${req.query.h || ''}"></iframe>`))
      app.use((req, res) => {
        if (req.query.h) res.setHeader('X-Frame-Options', req.query.h)
        res.send('<h1>Website</h1>')
      })
      app.listen(port)
    }
    

    No header

    Anyone can embed our website from localhost:1234 in a frame.

    X-Frame-Options: deny

    The page cannot be embedded in a frame at all.

    X-Frame-Options: sameorigin

    Only pages from the same origin can be embedded in the frame. The origins are the same if the domain, port and protocol match.

    X-Frame-Options: allow-from localhost:4321

    It seems that Chrome ignores this option, because there is the Content-Security-Policy header (discussed below). It does not work in Microsoft Edge either.

    Below: Mozilla Firefox.

    X-Content-Type-Options

    This header prevents MIME type spoofing attacks (<script src="script.txt">) and unauthorized hotlinking (<script src="https://raw.githubusercontent.com/user/repo/branch/file.js">).

    
    var express = require('express')
    var app = express()

    app.use('/script.txt', (req, res) => {
      if (req.query.h) res.header('X-Content-Type-Options', req.query.h)
      res.header('content-type', 'text/plain')
      res.send('alert("hacked")')
    })

    app.use((req, res) => {
      res.send(`<h1>Website</h1><script src="/script.txt?h=${req.query.h || ''}"></script>`)
    })
    app.listen(1234)
    

    No header

    http://localhost:1234/

    Even though script.txt is a text file served as text/plain, it will be executed as a script.

    X-Content-Type-Options: nosniff

    http://localhost:1234/?h=nosniff

    This time the types do not match and the file will not be executed.

    Content-Security-Policy

    This is a relatively new header that helps reduce the risk of XSS attacks in modern browsers by specifying which resources can be loaded on the page.

    For example, you can ask the browser not to execute inline scripts and to download files only from one domain. Inline scripts can look not only like <script>…</script>, but also like <h1 onclick="…">.

    Let’s see how it works.

    
    var request = require('request')
    var express = require('express')

    for (let port of [1234, 4321]) {
      var app = express()

      app.use('/script.js', (req, res) => {
        res.send(`document.querySelector('#${req.query.id}').innerHTML = 'changed ${req.query.id}-script'`)
      })

      app.use((req, res) => {
        var csp = req.query.csp
        if (csp) res.header('Content-Security-Policy', csp)
        res.send(`
          <html>
          <body>
            <h1>Hello, ${req.query.user || 'anonymous'}</h1>
            <p id="inline">will this be changed by the inline script?</p>
            <p id="origin">will this be changed by the same-origin script?</p>
            <p id="remote">will this be changed by the remote script?</p>
            <script>document.querySelector('#inline').innerHTML = 'changed inline-script'</script>
            <script src="/script.js?id=origin"></script>
            <script src="//localhost:1234/script.js?id=remote"></script>
          </body>
          </html>
        `)
      })

      app.listen(port)
    }
    

    No header

    It works as you would expect.

    Content-Security-Policy: default-src 'none'

    http://localhost:4321/?csp=default-src%20%27none%27&user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E

    default-src applies the rule to all resources (images, scripts, frames, etc.); the value 'none' forbids everything. Below you can see what happens and the errors displayed in the browser.

    Chrome refused to run any scripts. In this case, you can't even load a favicon.ico.

    Content-Security-Policy: default-src 'self'

    http://localhost:4321/?csp=default-src%20%27self%27&user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E

    Now it is possible to use resources from the same origin, but external and inline scripts still cannot run.

    Content-Security-Policy: default-src 'self'; script-src 'self' 'unsafe-inline'

    http://localhost:4321/?csp=default-src%20%27self%27;%20script-src%20%27self%27%20%27unsafe-inline%27&user=%3Cscript%3Ealert(%27hacked%27)%3C/script%3E

    This time we also allow inline scripts to execute. Note that the XSS attack in the request was blocked too. But this will not happen if you deliver both unsafe-inline and X-XSS-Protection: 0 at the same time.

    Other values

    The website content-security-policy.com shows many nice examples.

    • default-src 'self' — allow resources only from the same origin.
    • script-src 'self' www.google-analytics.com ajax.googleapis.com — allow Google Analytics, the Google AJAX CDN, and resources from the same origin.
    • default-src 'none'; script-src 'self'; connect-src 'self'; img-src 'self'; style-src 'self'; — allow images, scripts, AJAX and CSS from the same origin and prohibit loading any other resources. For most sites this is a good initial setting.

    I didn’t check, but I think that the following headers are equivalent:

    • frame-ancestors 'none' and X-Frame-Options: deny
    • frame-ancestors 'self' and X-Frame-Options: sameorigin
    • frame-ancestors localhost:4321 and X-Frame-Options: allow-from localhost:4321
    • script-src 'self' without 'unsafe-inline' and X-XSS-Protection: 1

    If you look at the headers of facebook.com or twitter.com, you will notice that these sites use a lot of CSP.

    Strict-Transport-Security

    HTTP Strict Transport Security (HSTS) is a security policy mechanism that helps protect a website from connection attempts over an unsecured connection.

    Let's say we want to connect to facebook.com. If you don't type https:// before the address, the HTTP protocol is selected by default, so the request will look like http://facebook.com.

    
    $ curl -I facebook.com
    HTTP/1.1 301 Moved Permanently
    Location: https://facebook.com/
    

    After that, we will be redirected to the secure version of Facebook.

    If you connect to a public WiFi hotspot owned by an attacker, the request may be intercepted and, instead of facebook.com, the attacker may substitute a similar-looking page to capture the username and password.

    To guard against such an attack, you can use the aforementioned header, which tells the client to use the HTTPS version of the site next time.

    
    $ curl -I https://www.facebook.com/
    HTTP/1.1 200 OK
    Strict-Transport-Security: max-age=15552000; preload
    

    If the user logged into Facebook at home and then tried to open it from an unsafe access point, he is not in danger, because browsers remember the header.

    But what happens if the user connects to the unsecured network for the first time? In this case, the protection will not work.

    But browsers have a trump card here: a predefined list of domains for which only HTTPS should be used.

    You can submit your domain at this address. It is also possible to find out there whether the header is used correctly.

    Accepted values:

    • max-age=15552000 — the time, in seconds, that the browser should remember the header.
    • includeSubDomains — if you specify this optional value, the header applies to all subdomains.
    • preload — used if the site owner wants the domain to get into the predefined list that is supported by Chrome (and used by Firefox and Safari).

    And what if you need to switch back to HTTP before max-age expires, or if you have set preload? You can set max-age=0, and then the rule redirecting navigation to the HTTPS version will stop working.
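
    To stay with the sandbox style used above, here is a minimal sketch of sending the header from Express (my own addition with example values, not part of the original demos); note that browsers only honour HSTS when it arrives over an HTTPS connection:

    var express = require('express')
    var app = express()

    app.use((req, res, next) => {
      // Ask the browser to use HTTPS for this host for the next 180 days,
      // including its subdomains. Ignored by browsers over plain HTTP.
      res.setHeader('Strict-Transport-Security', 'max-age=15552000; includeSubDomains')
      next()
    })

    app.use((req, res) => res.send('<h1>Website</h1>'))
    app.listen(1234)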

    Public-Key-Pins

    HTTP Public Key Pinning (HPKP) is a security policy mechanism that allows HTTPS websites to protect themselves against malicious use of forged or fraudulent certificates.

    Accepted values:

    • pin-sha256="<sha256>" — the Base64-encoded fingerprint of the Subject Public Key Info (SPKI), in quotes. You can specify multiple pins for different public keys. Some browsers may use other hashing algorithms besides SHA-256 in the future.
    • max-age=<seconds> — the time, in seconds, during which only the listed keys should be used to access the site.
    • includeSubDomains — if you specify this optional parameter, the header applies to all subdomains.
    • report-uri="<URL>" — if you specify a URL, then on a key validation error a corresponding report will be sent to the specified address.

    Instead of the Public-Key-Pins header you can use Public-Key-Pins-Report-Only; in that case key mismatch reports will be sent, but the browser will still load the page.

    This is what Facebook does:

    
    $ curl -I https://www.facebook.com/
    HTTP/1.1 200 OK
    ...
    Public-Key-Pins-Report-Only:
        max-age=500;
        pin-sha256="WoiWRyIOVNa9ihaBciRSC7XHjliYS9VwUGOIud4PB18=";
        pin-sha256="r/mIkG3eEpVdm+u/ko/cwxzOMo1bk4TyHIlByibiA5E=";
        pin-sha256="q4PO2G2cbkZhZ82+JgmRUyGMoAeozA+BSXVXQWB8XWQ=";
        report-uri="http://reports.fb.com/hpkp/"
    

    Why is this necessary? Aren't trusted certification authorities (CAs) enough?

    An attacker can create his own certificate for facebook.com and trick the user into adding it to the list of trusted certificates, or the attacker may be an administrator.

    Let's try to create such a certificate for facebook.com.

    
    sudo mkdir /etc/certs
    echo -e 'US\nCA\nSF\nFB\nXX\nwww.facebook.com\nn*@**am.org' | \
    sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
      -keyout /etc/certs/facebook.key \
      -out /etc/certs/facebook.crt
    

    And make it trusted in the local system.

    
    # curl
    sudo cp /etc/certs/*.crt /usr/local/share/ca-certificates/
    sudo update-ca-certificates

    # Google Chrome
    sudo apt install libnss3-tools -y
    certutil -A -t "C,," -n "FB" -d sql:$HOME/.pki/nssdb -i /etc/certs/facebook.crt

    # Mozilla Firefox
    #certutil -A -t "CP,," -n "FB" -d sql:`ls -1d $HOME/.mozilla/firefox/*.default | head -n 1` -i /etc/certs/facebook.crt
    

    Now run the web server using this certificate.

    
    var fs = require('fs')
    var https = require('https')
    var express = require('express')

    var options = {
      key: fs.readFileSync(`/etc/certs/${process.argv[2]}.key`),
      cert: fs.readFileSync(`/etc/certs/${process.argv[2]}.crt`)
    }

    var app = express()
    app.use((req, res) => res.send(`<h1>hacked</h1>`))
    https.createServer(options, app).listen(443)
    

    Point the domain at our server and start it:

    
    echo 127.0.0.1 www.facebook.com | sudo tee -a /etc/hosts
    sudo node server.js facebook
    

    Let's see what happens:

    
    $ curl https://www.facebook.com
    <h1>hacked</h1>
    

    Great: curl considers the certificate valid.

    Since I have already visited Facebook and Google Chrome has seen its headers, it should report the attack but still allow the page (Report-Only mode), right?

    Nope. Keys are not checked because of the local root certificate [public key pinning bypassed]. This is interesting…

    Well, and what about www.google.com?

    
    echo -e 'US\nCA\nSF\nGoogle\nXX\nwww.google.com\nn*@**am.org' | \
    sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
      -keyout /etc/certs/google.key \
      -out /etc/certs/google.crt
    sudo cp /etc/certs/*.crt /usr/local/share/ca-certificates/
    sudo update-ca-certificates
    certutil -A -t "C,," -n "Google" -d sql:$HOME/.pki/nssdb -i /etc/certs/google.crt
    echo 127.0.0.1 www.google.com | sudo tee -a /etc/hosts
    sudo node server.js google
    

    The same result. I think this is a feature.

    In any case, if you do not add these certificates to the local store, you will not be able to open the websites at all, because there will be no option to continue with an insecure connection in Chrome or to add an exception in Firefox.

    Content-Encoding: br

    The data is compressed with Brotli.

    The algorithm promises better compression than gzip and comparable decompression speed. It is supported by Google Chrome.

    Of course, there is a module for it in node.js.

    
    var shrinkRay = require('shrink-ray')
    var request = require('request')
    var express = require('express')

    request('https://www.gutenberg.org/files/1342/1342-0.txt', (err, res, text) => {
      if (err) throw new Error(err)
      var app = express()
      app.use(shrinkRay())
      app.use((req, res) => res.header('content-type', 'text/plain').send(text))
      app.listen(1234)
    })
    

    Original size: 700 KB

    Brotli: 204 KB

    Gzip: 241 KB

    Timing-Allow-Origin

    Using the Resource Timing API, you can find out how much time it took to process resources on a page.

    Because load-time information can be used to determine whether the user has visited a page before (given that resources may be cached), the standard is considered vulnerable if such information is given out to arbitrary hosts.

    
    <script>
    setTimeout(function() {
      console.log(window.performance.getEntriesByType('resource'))
    }, 1000)
    </script>

    <img src="http://placehold.it/350x150">
    <img src="/local.gif">
    

    It seems that if you do not specify Timing-Allow-Origin, detailed timing information about the operations (DNS lookup, for example) is only available for same-origin resources.

    You can use this:

    • Timing-Allow-Origin: *
    • Timing-Allow-Origin: http://foo.com http://bar.com
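
    As a small addition in the same Express sandbox style (the resource path here is hypothetical), this is how a server could allow any origin to read the detailed timing entries of a resource it serves:

    var express = require('express')
    var app = express()

    app.use('/local.gif', (req, res) => {
      // Allow pages from any origin to see detailed Resource Timing data
      // (DNS, connect, request and response phases) for this resource.
      res.setHeader('Timing-Allow-Origin', '*')
      res.sendFile('/tmp/local.gif')   // hypothetical path to the image file
    })

    app.listen(1234)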

    Alt-Svc

    Alternative Services allow resources to be located in different parts of the network, and access to them can be obtained using different protocol configurations.

    Google uses it:

    • alt-svc: quic=":443"; ma=2592000; v="36,35,34"

    This means that the browser, if it wishes, can use QUIC (HTTP over UDP) on port 443 for the next 30 days (ma=2592000 seconds, i.e. 720 hours, or 30 days). I have no idea what the v parameter means; the version, perhaps?

    P3P

    Below are some P3P headers that I have seen:

    • P3P: CP="This is not a P3P policy! See support.google.com/accounts/answer/151657?hl=en for more info."
    • P3P: CP="Facebook does not have a P3P policy. Learn why here: fb.me/p3p"

    Some browsers require third-party cookies to support the P3P protocol, which declares privacy measures.

    The organization that founded P3P, the World Wide Web Consortium (W3C), halted work on the protocol a few years ago because modern browsers never fully supported it. As a result, P3P is outdated and does not cover technologies that are now used on the web, so most sites do not support it.

    I didn't dig too deep, but apparently the header is needed for IE8 to accept third-party cookies.

    For example, if the IE privacy setting is set to high, all cookies from sites that do not have a compact privacy policy will be blocked, but those with headers similar to the ones above will not.

    Which of the following HTTP headers do you use in your projects?

    X-XSS-Protection
    X-Frame-Options
    X-Content-Type-Options
    Content-Security-Policy
    Strict-Transport-Security
    Public-Key-Pins
    Content-Encoding
    Timing-Allow-Origin
    Alt-Svc
    P3P
    Other

    This text is a translation of the article "Экзотичные заголовки HTTP" (Exotic HTTP headers) published by @A3a on habrahabr.ru.

    About the CleanTalk service

    CleanTalk is a cloud service to protect websites from spambots. CleanTalk uses protection methods that are invisible to the visitors of the website. This allows you to abandon the methods of protection that require the user to prove that he is a human (captcha, question-answer etc.).

  • New anti-spam checks for WordPress, XenForo, phpBB 3.1, SMF, Bitrix

    We are pleased to announce that we have released new versions of plugins for WordPress, XenForo, phpBB 3.1, SMF, Bitrix.

    In the new versions, we have added some new spam checks to improve the anti-spam service.

    Mouse tracking and time zone monitoring give good results against spam bots that simulate the behavior of real visitors.
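
    As a rough illustration of the general idea only (this is not CleanTalk's actual implementation), a page can record whether any real mouse movement occurred and the visitor's time zone offset, and attach both to the submitted form so the server can compare them with the rest of the request data:

    // Generic sketch, not CleanTalk's code: collect two behavioural signals
    // and add them as hidden fields when a form on the page is submitted.
    var sawMouseMovement = false
    document.addEventListener('mousemove', function () { sawMouseMovement = true })

    document.addEventListener('submit', function (event) {
      function addHiddenField (name, value) {
        var input = document.createElement('input')
        input.type = 'hidden'
        input.name = name
        input.value = value
        event.target.appendChild(input)
      }
      addHiddenField('mouse_moved', sawMouseMovement ? '1' : '0')
      addHiddenField('tz_offset_minutes', String(new Date().getTimezoneOffset()))
    })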

    These checks for other CMS will be added soon.

    Please update your anti-spam plugins to the latest version:

    WordPress
    XenForo
    phpBB 3.1
    Simple Machines Forum
    Bitrix

  • New version of the Security Service by CleanTalk

    New version of the Security Service by CleanTalk

    As we announced earlier, CleanTalk has launched its website security project. The service protects the administrator control panel from brute force attacks and records user actions.

    Since the 29th of November, Security by CleanTalk has become a cloud service, and all the main data is now available in the service Dashboard. The cost of the service is $20 per year for 1 website.

    Switching to cloud data storage allows us to show more data and lets you use the information more flexibly thanks to the different filters in your Dashboard.

    In previous versions, all data was stored in the website database, and a large amount of information, together with the operations on it, would affect website speed; this could result in worse rankings in search engines. Cloud data storage is also safer than the website database: if an intruder gained access to your website, he could delete all the data he might be traced with.

    The cloud service stores data for the last 45 days, including the user action log, brute force attack statistics and successful backend logins, so you can always find out who did what if necessary.

  • Breeding Business: from ordinary blog to extraordinary magazine

    A geek at heart, I have always been coding little projects on localhost and a few failing websites. I guess I never really took the Internet seriously.

    Then, I realized these jobs I was doing in luxury hospitality were not making me happy. I just loved coming back home and writing, developing and designing. It’s just what I love. So I started looking at opportunities to generate a very small income that could make a website sustainable. And I had zero money to invest.

    Over the last few years, WordPress and blogging have been a huge hit and a lot of people go for it. They think about monetization before having thought about their content; I took it the other way around.

    Why Blogging About Dog Breeding?

    When I set my mind to starting an online blog, I looked at the usual ways of finding the perfect "keyword", "topic" or "niche". These include Google Keyword Planner, Google Trends and some paid software. I ended up with three topics that people were apparently searching for and that I was happy to write posts on.

    Then, I picked the best topics and started writing. And this is when I realized I couldn't write about anything other than what I truly loved — responsible and ethical dog breeding. I was writing one article after another. It just felt right.

    Breeding dogs is something that has run through several generations of my family and, although I haven't done it extensively myself, I am passionate about canine genetics and the mechanisms that let you build the best bloodline of all.

    Dog breeding is a passion of mine and it would be hard for me not to write about it.

    What Is Breeding Business?

    Breeding Business was born after I wrote a few articles. I was going on Facebook Groups at the time to promote my articles (and eventually got suspended!) because Google wasn’t sending me enough traffic at first.

    The website consists of a lot of articles written and published in different categories: how-to’s, interviews of breeders, reviews of dog breeding supplies, and obviously in-depth articles on how to breed dogs.

    After just a few weeks, some visitors started asking what books we were recommending. Unfortunately, most books are either too narrow in their topics or too breed-specific. A dog is a dog, and the principles remain the same for a Chihuahua or a Rottweiler.

    Therefore, we created our very own ebook, The Dog Breeder's Handbook. It was created in iBooks Author, since it's a free application built by Apple, and at the time I didn't know whether the ebook was going to be a hit or a miss. I like to be in motion, try things and, if they fail, move on to the next one.

    The Dog Breeder's Handbook offers all the theoretical knowledge dog breeders need and a lot of actionable tips for them to put into practice. The launch was slow because the traffic was low, yet it was definitely generating a few hundred dollars every month. This is what kept me going and made me believe in it even more.

    From then on, I decided to add another product many visitors were hinting at: a WordPress plugin for dog breeders. I built it in a few weeks and it is today a very good seller. I release updates using the feedback loop and have a similar project to be released soon.

    Challenges When Growing a Simple Blog Into an Online Magazine

    Being alone and seeing the traffic (and revenue) grow, questions start to pop up in your mind.

    It’s time for some business decisions

    A blogger and solo-entrepreneur always strives for steady growth. I do not identify myself with mega-growth startups we read about everywhere. To each their own!

    With Breeding Business, the growth has been great, especially since Google started sending traffic our way. There was no specific strategy that we followed; we just put out great content. Often.

    Yet, we're still asking ourselves a million questions…

    • Should I add another product or should I focus and grow these?
    • Communities around blogs are hype, should I make one?
    • Is the traffic growth normal or too slow?
    • Subscriptions are so popular these days, but what to offer?

    These are business decisions to make. I added another product: a course. It never took off, mainly because it largely duplicated what was in the ebook. We're thinking about a new use for courses in the future, because I could see people were interested.

    Communities are great but there is nothing worse than a dead forum so we never took that risk and are waiting to have a bigger email list to perhaps one day launch a community. Subscriptions are great but just not for us right now. A lot of blogs start charging a monthly or yearly fee for members to be part of a special club but most of them see a huge churn and give that model up after a few months.

    Growth requires a technical overhaul, too

    Our traffic has been growing very well thanks to search engines. This is why we needed a quality anti-spam solution, and CleanTalk has been doing a sublime job at keeping fake user accounts and comments away.

    With traffic growth comes a whole new set of questions:

    • Why am I not converting more visitors into opt-ins or customers?
    • GTmetrix and page speed tests are giving me low scores, how can I optimize my website?
    • Why do so many people read one article and leave?

    These are technical issues that truly take time to be fixed. There are mainly two ways we could tackle these:

    1. Patch each little issue one by one
    2. Build a brand new website from scratch with these issues factored in

    After a few months of patching issues one by one, I am today almost finished with a brand new version of the website, to be released in two or three months after extensive testing. We're also pairing that new website with a move from cloud hosting to a VPS (increasing the monthly hosting cost tenfold…).

    Restructure the tree of information

    Our current website was up and running when we had around 20-30 articles. We have over 300 articles today. People aren’t visiting other pages because the information is badly structured and they can’t find their way around.

    Categories are being completely revamped. Stuff we thought was going to attract a lot of people ended up being a graveyard, and vice versa. So we're cleaning up the way the posts are categorized and tagged while updating old pages as well.

    Speed and page load

    Google is apparently using your website’s loading speed as a signal to decide on your ranking. My website is currently performing very poorly in terms of page load speed.

    And these results are after several fixes here and there. So it's the second main focus of the update. We're also making sure the website loads much, much faster on mobile devices thanks to wp_is_mobile(), the WordPress function for detecting mobile devices. We load lower-quality images and fewer widgets.

    Another WordPress optimisation is the use of the Transients API for our most repeated and complicated queries such as our top menu, footer, home queries, related posts, etc. The way it works is simple and allows you to store cached data in the database temporarily. Instead of retrieving the full menu at each page load, using a transient only requires a single database call for the menu to be fetched.

    Add new UX features

    The new version of Breeding Business brings its own set of new UX features: more AJAX calls, fewer page refreshes, more white space and an easier scroll through the entire page. We've also decluttered the article footer so our calls to action jump out at visitors.

    Conclusion is… One man can only do so much!

    Everything written here is what I do daily: article writing, support emails, plugin updates, website updates, email outreach, designing illustrations, social media promotion, bookkeeping and accounting, strategizing and long-term planning, etc. And I'm not helping myself by adding a new recurring item to our upcoming version: biweekly giveaways!

    Over the last few weeks, I realized how stupid it is to rely only on yourself. It's self-destructive and counterproductive. I genuinely believed that delegating any of these tasks would result in a loss of quality and would cost me money.

    Yet, I have to leave my ego at the door and put some faith in other people. Sure, I may work with some disappointing people at first but it is also my duty to teach them how I want them to work.

    This is my focus for 2017 — learn how to surround myself with the right people (or person) to free some time for me to focus on what I do best.

     

    About the author

    Lazhar is the founder of Breeding Business, a free online magazine educating responsible dog breeders all around the world through in-depth dog breeding articles, interviews, ebooks and comprehensive guides.

  • What is AMP (Accelerated Mobile Pages)? How to setup CleanTalk for AMP

    What is AMP?

    Accelerated Mobile Pages is a tool for creating static-content web pages that load almost instantly on mobile devices. It consists of three parts:

    1. AMP HTML — HTML with restrictions for reliable performance and some extensions for building rich content.
    2. AMP JS — a library that ensures fast rendering of AMP pages. Third-party JavaScript is forbidden.
    3. Google AMP Cache — a proxy-based content delivery network for delivering all valid AMP documents. It fetches AMP HTML pages, caches them and improves page performance automatically.

    Advantages

    • A lightweight version of standard web pages with high loading speed.
    • Instant loading of multimedia content: videos, animations, graphics.
    • Identical encoding — the same fast-rendered website content on different devices.
    • The AMP project is open source, which enables free sharing of information and contribution of ideas.
    • A possible advantage in SEO, as page loading speed is one of the ranking factors.
    • There are plugins for popular CMSs that make it easier to use AMP on your website.

    How to use it in WordPress

    When choosing which AMP plugin to use, keep in mind the following:

    — Integration with SEO plugin for attaching corresponding metadata.

    — Analytics gathering with traffic tracking of your AMP page.

    — Displaying ads if you are a publisher.

    Available plugins in the WordPress catalog:

    1. AMP by Automattic
    2. Facebook Instant Articles & Google AMP Pages by PageFrog
    3. AMP – Accelerated Mobile Pages
    4. AMP Supremacy
    5. Custom AMP (requires installed AMP by Automattic)

    As an example, let's install and activate AMP by Automattic and create a new post with multimedia content. Please note: a post, not a page. Pages and archives are not currently supported.

    The AMP by Automattic plugin automatically converts your post into an accelerated version, so you don't have to duplicate it yourself. Just add /amp/ (or ?amp=1) to the end of your post link and that will be enough.

    How to setup CleanTalk for AMP

    Please make sure that the option "Use AJAX for JavaScript check" is disabled, as it will prevent regular JavaScript execution.

    The option is here:

    WordPress Admin Page —> Settings —> CleanTalk and uncheck SpamFireWall.

    Then, click on Advanced settings —> disable "Use AJAX for JavaScript check" —> Save Changes.

    Other options will not interfere with the functioning of AMP posts. The CleanTalk Anti-Spam plugin will protect all data submission fields that are rendered after the conversion.

    For now, most AMP plugins remove the ability to comment and to send contact form data on accelerated pages.

    Google validation

    Now you need to validate your website's structured data using the Google testing tool:

    https://search.google.com/structured-data/testing-tool/

    If you don't do this, the search bot will simply not pay attention to your post and no one will see it in the search results.

    Copy and paste the link to your AMP post and look at the result. Fix any problems it points out.

    After that your AMP version of the post will be ready to use.

    Links

    AMP project:
    https://www.ampproject.org/

    AMP blog:
    https://amphtml.wordpress.com/

    AMP plugins in the WordPress catalog:
    https://wordpress.org/plugins/search.php?q=AMP

    Google Search recommendations of how to create accelerated mobile pages:
    https://support.google.com/webmasters/answer/6340290?hl=en