This week has been the week of “what took so long”: Instagram is finally taking action on fake followers, and Firefox is finally addressing its speed issues.
In other discussions of note, members share tips on how webmasters can control “dodgy” reviews, Google offers a not-so-clear clarification on duplicate content, and more!
Instagram Clamps Down on Bots and the Companies Running Them
The consensus on Webmaster World seems to be, “what took so long?”
There’s the “follow the money” and there’s the “brand recognition”, and then there’s “pr by spam” … all detrimental to most communities.
Glad it’s happening. Wondering why it took so long. Maybe they needed the Fake Numbers to bolster their sale to FB?
One member also noted that if Facebook had been keeping up with Webmaster World’s User Agent and Bot ID forum, it might have acted sooner.
Can Firefox 54 Revive the Browser’s Market Share?
With the latest update, Firefox promises faster speeds and reduced memory usage. For those of us wondering what took so long: according to the scoop in this article, the holdup was moving from a single-process architecture to a multi-process architecture. Nick Nguyen, VP of Product for Firefox, urges former users to give it another chance,
“if you’ve stopped using Firefox, give it a try again.”
Upgrading to HTTPS: Absolute vs. Relative Links
In a recent case of moving to HTTPS, a member asks about changing from absolute to relative link references and whether the change might have any impact.
There’s no earthly reason ever to use absolute links for your own site, except in the rare case where some areas are http-only while others are https-only.
That’s assuming when you say “relative links” you mean ones beginning in / or //. If you mean links beginning in ../ then it’s time to have a talk with your host.
Changing references from absolute to relative has been a controversial topic in the past, but on this thread members seem to agree that relative links are fine, so long as all the steps involved are done correctly.
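For anyone making the same change, here is a minimal Python sketch of that kind of rewrite; the hostnames, the helper name, and the regex are illustrative assumptions, not a drop-in tool (a real migration would also need to handle `<base>` tags, srcset, and CSS url() references):

```python
import re
from urllib.parse import urlsplit

SITE_HOSTS = {"www.example.com", "example.com"}  # placeholder hostnames

def to_root_relative(html: str) -> str:
    """Rewrite absolute same-site href/src values to root-relative paths."""
    def repl(match: re.Match) -> str:
        attr, url = match.group(1), match.group(2)
        parts = urlsplit(url)
        # Only rewrite links pointing at our own host(s);
        # leave external links untouched.
        if parts.hostname in SITE_HOSTS:
            path = parts.path or "/"
            if parts.query:
                path += "?" + parts.query
            return f'{attr}="{path}"'
        return match.group(0)

    return re.sub(r'\b(href|src)="(https?://[^"]+)"', repl, html)

print(to_root_relative('<a href="http://www.example.com/page?id=1">x</a>'))
# -> <a href="/page?id=1">x</a>
```

Root-relative links (beginning with /) survive an HTTP-to-HTTPS move unchanged, which is exactly why the thread favors them.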
Dealing With Dodgy Reviews
Cre8asiteforums member Tam asks about dealing with dodgy reviews on her website, and member earlperl discusses methods other providers use to control reviews (a rough sketch of these heuristics in code follows the list):
- filtering reviews from people who aren’t active reviewers/yelpers
- filtering potentially inflammatory content, such as political reviews
- flagging too many reviews submitted in a short time period
- filtering first-time reviewers
- flagging reviews sent in from a review station, or too many reviews from one IP
- flagging the use of URLs in reviews
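None of these providers publish their exact rules, but a rough Python sketch of how such heuristics might combine looks like this; the Review fields, thresholds, and flagged terms are all hypothetical placeholders:

```python
import re
from dataclasses import dataclass

# Hypothetical thresholds -- real review platforms tune these privately.
MIN_REVIEWER_HISTORY = 3           # below this, treat as a first-time reviewer
MAX_RECENT_FROM_ONE_IP = 5         # burst limit per IP (review-station signal)
FLAGGED_TERMS = {"election", "politician"}  # illustrative inflammatory terms

@dataclass
class Review:
    text: str
    reviewer_review_count: int   # how many reviews this account has written
    recent_from_same_ip: int     # reviews from this IP in the last hour

def hold_for_moderation(r: Review) -> bool:
    """True if a review trips any of the heuristics listed above."""
    if r.reviewer_review_count < MIN_REVIEWER_HISTORY:
        return True   # inactive or first-time reviewer
    if r.recent_from_same_ip > MAX_RECENT_FROM_ONE_IP:
        return True   # too many reviews in a short period / from one IP
    if re.search(r"https?://", r.text):
        return True   # URLs embedded in the review
    if any(term in r.text.lower() for term in FLAGGED_TERMS):
        return True   # potentially inflammatory (e.g. political) content
    return False

print(hold_for_moderation(Review("Visit http://spam.example", 0, 9)))  # True
```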
Attempting to Define Duplicate Content
In a recent tweet, Google’s Gary Illyes, when asked to define duplicate content, gave an appropriately short and sweet answer,
“Think of it as a piece of content that was slightly changed, or if it was copied 1:1 but the boilerplate is different”.
Unfortunately, this provides very little guidance for webmasters in terms of thresholds, especially for enterprise websites where content is mostly dynamically driven.
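Google has never published a similarity threshold, but a common textbook way to measure “slightly changed” content is word shingling plus Jaccard similarity. This short Python sketch is our own illustration of the idea, not Google’s method:

```python
def shingles(text: str, k: int = 5) -> set:
    """Break normalized text into overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "the quick brown fox jumps over the lazy dog near the river bank"
page_b = "the quick brown fox jumps over the lazy dog near the river bend"
print(f"{jaccard(page_a, page_b):.2f}")  # 0.80 -- a one-word change
```

On a real site you would strip the boilerplate (navigation, headers, footers) before comparing, which is exactly the distinction Illyes draws between 1:1 copies with different boilerplate.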
Is Crawl Delay Still Used?
Members discuss the crawl-delay directive and whether it is still used. Member NoOneSpecial argues that the directive is obsolete, since modern crawling systems are advanced enough not to need it, and clarifies that Google, specifically, ignores it.
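Whatever individual crawlers do with it, the directive is easy to inspect: Python’s standard library has been able to read Crawl-delay since 3.6. A quick sketch, with the robots.txt URL and user agent as placeholders:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder URL
rp.read()

# crawl_delay() (Python 3.6+) returns the Crawl-delay value for the
# given user agent, or None if the directive is absent.
delay = rp.crawl_delay("mybot")
print(delay if delay is not None else "no crawl-delay set")
```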