Nicely Displayed Tweets

While looking into how to get a tweet from Twitter nicely formatted, I tried all kinds of tools, websites and tips. In my case, I wanted to put a nice screenshot of a tweet by Alex Yates into my DevOps lecture slides.

Sharing the tweet via e-mail seemed to be a possibility, but it did not look as nice as I expected.

It turned out that Twitter provides functionality to nicely display Twitter URLs out of the box via https://publish.twitter.com.

All you have to do is paste the Twitter link into the text box and everything is nicely rendered for you.

Link: https://publish.twitter.com/#

NGINX WordPress Rewrites

After moving my blog to its new domain try-catch-finally.net, there was one major issue left open: search engines. Google, Bing, DuckDuckGo and whatnot have their indices. I want to make sure that when you hit one of the search results, you end up at the proper page.

NGINX allows you to do this with a few lines of configuration. The location section lets you match against path segments and apply a rewrite rule.

The trick is done by the two parameters $1 and $2: $1 is the content matched by the first pair of parentheses, while $2 is the rest of the path from your request, matched by the second pair. Once I understood this pattern, it was easy to write the rule below.

server {   
  ...   
  location ~ /(2004|2005|2006|2007|2008|2009|2010|2011|2012|2013|2014|2018|2019|feed|comments|tag|author|category)/(.*) {
    return 301 https://www.hack-the-planet.net/$1/$2;     
  }   
  ... 
}

In addition to forwarding the request, the client receives the HTTP status code 301 (Moved Permanently). That way there is a good chance search engines pick up the change for this particular URL.
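
To double-check that the rule behaves as intended, a quick sketch in Python (assuming the third-party requests package; the URL below is just a placeholder for one of the old paths) can confirm that the server answers with a 301 and the expected Location header:

import requests

# Placeholder URL: replace with one of the old URLs covered by the location block above
old_url = "https://www.example.org/2019/some-old-post"

# Do not follow the redirect; we want to inspect the 301 response itself
response = requests.get(old_url, allow_redirects=False)

print(response.status_code)              # expected: 301
print(response.headers.get("Location"))  # expected: the corresponding URL on the new domain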

Google XML Sitemaps

I started to learn a bit about Google Webmaster Tools and how to increase the findability of one’s website. First of all, I was looking for a flexible sitemap generator for WordPress and ended up with Google XML Sitemaps.

Google XML Sitemap Plugin

This plugin generates a sitemap file which can be consumed by search engines like Google or Bing. For the sitemap to be picked up, you have to verify your website with the various providers. Usually, this is done by adding some meta tags to your HTML pages to prove you have full control over the server. Currently, this can be achieved with WordPress’ Jetpack, so you don’t have to fiddle with the header.php file of your WordPress installation.

WordPress Jetpack Site verification

You don’t have to, but you can sign into the Google Webmaster Tools to check the verification status of your site.

Bing Webmaster Center

Bing Webmaster Center will also provide you with a meta tag you can hand over to Jetpack to verify your site and improve its discoverability.

Usable Google Reader Alternative

I haven’t used feed readers much since Google Reader was cancelled on March 13, 2013. Before that, I made extensive use of it with quite a lot of sources: personal weblogs, article feeds and so on. Once Google cancelled Reader, it seemed many applications and sites stopped supporting RSS and Atom the way they did before. I am not saying they no longer provided feeds; however, it seemed feeds were no longer first-class citizens of the Web.

Since my personal focus recently shifted towards new and current technologies (again), I started to read a lot more online articles than I did during the past six years. So once again, I am looking for an alternative to Google Reader.

Feedly never seemed to work for me for various reasons, including a monthly fee of $5. But recently I found Inoreader to be very handy; I had not been aware of this service before.

It is a freemium service as well; however, with the free membership you seem to get all the basic functionality you need. The only limitation appears to be a cap of 150 sources. Once you reach it, you can upgrade to a $20 yearly plan that raises the limit to 500 sources. This seems an absolutely reasonable price I am willing to pay for this service.

Inoreader Screenshot

So I will give it a try to keep up to date with my favourite sites, looking forward to hitting the mark of 150 sources soon.

Link: https://www.inoreader.com/

Monitoring your Site with Apex Ping

As I meanwhile run quite a number of websites, blogs and other services, I was looking for a monitoring option that does not run on the very server it is monitoring.

I was pointed to Apex Ping, which offers simple and beautiful monitoring of various aspects of your website.

Apex Ping Homepage

While it is a bit pricey for my use case at the moment, it is a really nice service one can consider for non-invasive monitoring of your sites.

Link: https://apex.sh/

Have You Been Pwned?

Another way to figure out whether your account and the corresponding account data have been compromised is https://haveibeenpwned.com/.

Have I Been Pwned 

Besides the web form, a RESTful API is provided to run checks automatically. Right now 6,474,028,664 accounts from about 340 hacked websites are listed. A list of the breaches the data comes from is provided as well. Altogether, it is an easy way to check whether your digital identity was recently stolen.
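
To give an idea of how such an automated check could look, the related Pwned Passwords range API can be queried without any key using k-anonymity: only the first five characters of the SHA-1 hash of a password are sent, and the response lists matching hash suffixes with their counts. A minimal sketch in Python (assuming the requests package and the endpoint at api.pwnedpasswords.com; treat the details as my reading of the API, not official documentation):

import hashlib
import requests

def pwned_count(password: str) -> int:
    # Upper-case hex SHA-1 hash of the password, split into prefix and suffix
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]

    # Only the 5-character prefix is sent to the service (k-anonymity)
    response = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}")
    response.raise_for_status()

    # Each response line has the form "<suffix>:<count>"
    for line in response.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

print(pwned_count("password123"))  # a count greater than zero means the password appeared in breaches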

Collection #n

After Collection #1, it did not take long until additional sets of leaked account and password information appeared. Meanwhile there are Collection #2 to Collection #5.

Altogether, more than 8,000,000,000 records have meanwhile been leaked. While I accept, and actually expect, that systems will be hacked at some point (remember, it is not about the if, it is about the when), I cannot understand how the actual passwords are stored.

When I designed a large multi-user system some years ago, we did not store clear-text passwords in the system. We did not even transport the password from the client to the server in plain text. That said, I still try to imagine how anyone could even think of storing passwords in plain text.
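
Just to illustrate the point, here is a minimal sketch of how passwords could be stored salted and hashed instead of in plain text, using nothing but Python’s standard library. The function names and parameters are illustrative only, not what we used back then:

import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    # A random, per-user salt prevents identical passwords from producing identical hashes
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 100_000)
    return salt, digest  # store salt and digest; the clear-text password is never persisted

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 100_000)
    # Constant-time comparison to avoid timing side channels
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong password", salt, digest))                # False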

If you are interested in whether any of your passwords have been leaked, you should probably check the Identity Leak Checker service provided by the Hasso Plattner Institute.

HPI Password Leak Checker

I actually checked three mail addresses I usually use to sign in at various services.

Leaked Passwords #1

As this is a mail address I don’t use a lot to sign in at public services, the result was not very surprising. Actually, that way I even found an account to delete. For my second address, things do not look as good: the mail address (and probably passwords) appears in Collections #1 to #2.

Leaked Passwords #2

The same is actually true for the third and last address I use for public services.

Leaked Passwords #3

While I do reset passwords from time to time, it is still worrying that so many passwords have been leaked. I will probably change the passwords of some of my major accounts and delete some accounts I really won’t use anymore, or have never used at all, such as a MySpace account I had completely forgotten about.

That way, the HPI Identity Leak Checker might also help to uncover forgotten accounts worth closing.

There it Goes – Google Reader Gone for Good

Icon by http://icontexto.blogspot.de/ via Creative Commons Attribution Non-commercial Share Alike (by-nc-sa)

First headline in this morning’s news: Google stops Google Reader (please bear in mind, this link won’t work anymore in the future). Google wants to power down Google Reader among other APIs and services as part of the spring cleaning it has been running since 2011. Personally, I am affected by Google’s cut-downs for the third time, after the Feedburner burnout last year.

While I was annoyed in the very first moment, I had to think through various perspectives, not just coming up with yet another rant post about Google’s attitudes.

The Business Point of View
Google is not doing anything wrong (I guess) from a business point of view. They simply cut down projects, teams or cost centers with little or no revenue. I have seen this several times during my time at Microsoft, where teams or studios were shut down because their revenue did not meet expectations. Larry Page wanted to focus on core products and less speculative projects, which does make sense considering the shareholders behind Google. Consequently, cutting down free services that are not paid for, that require manpower for development and maintenance and (not to underestimate) bare metal down in Google’s data centers is a plan to increase revenue, cut losses and save rather than spend money.

The User’s Point of View
As a user, you might rely on these services. Maybe you built your website on various Google APIs (as they were free), you maintained your RSS feeds in Google Reader and so on. Even with several weeks of notice, you need to change technologies, maybe rebuild or recode your page and, even worse, change habits. At some point, after this has happened one, two or three times (depending on your very personal capacity for suffering), you will stop relying on such services.

The Developer’s Point of View
There are quite a few apps, tools and pages out there heavily depending on or built upon Google’s APIs, including Google Reader. Not only will these apps and tools stop working; users who bought these products will also be forced to stop using them. With Feedly, there is a timely alternative reader, and with Normandy, developers get an API they might use for their products. However, Nick Bradbury has already announced that he will stop working on the Windows client FeedDemon, which heavily depends on the Google API for synchronization. More will definitely follow…

The Consequences
As a developer, I was affected once before; as a user, I am now affected for the second time. With both services cut down, I am left with Google Calendar. While Google might or might not continue this service in the future, one might rethink whether using it is a good choice. Keep in mind, we do not pay for it as users, and Google Apps Sync is meanwhile only available for business users (who probably pay for it). Google Calendar Sync was a great tool to sync between Outlook and Google Calendar; I fought my way through the setup on Windows 7 three years ago, right after they stopped developing it.

There is already a petition for keeping Google Reader alive, supported by more than 35,000 users (nothing compared to the 10 million users of G+ cited by Larry Page). Still, the chances that Google will continue the service are less than probable.

The Business Point of View Revisited
I wonder whether Google ever thought of charging for these services, and whether one (e.g. I) would pay for such a service. It would definitely depend on the amount they charged. A few bucks a year won’t hurt, and with a few tens of thousands of users they might pay the bills for this service, one might think. On the other hand, a company like Google might not be interested in any service with less than ten million $$$ of revenue (please put in whatever amount you think is suitable) or a million users…

Fixing ASP.NET MVC 4 Web API 404

For a Web Service providing some REST-style URIs to access the data, I decided to use the ASP.NET MVC 4 Web API. Once developed, tested and deployed I experienced a mysterious 404 on my production server.

ASP.NET Web API 404

The Web API originally started as the WCF Web API on CodePlex and is now fully integrated into the latest .NET Framework:

“ASP.NET Web API represents the joint efforts of the WCF and ASP.NET teams to create an integrated web API framework. You can get the bits and find articles, tutorials, samples and videos on the new ASP.NET Web API home page. All you have to do is to..”

The tutorials and examples for the ASP.NET Web API are overall easy to understand, and you will probably get up to speed with the technology very quickly. After I had set up my first Web API, which worked absolutely perfectly on Windows 8, developed using Visual Studio 2012 and tested with IIS Express, I was not able to get the bits running on the deployment server: a Windows Server 2008 R2 with IIS 7.5 and a whole bunch of stuff installed using the Web Platform Installer.

Make sure the .NET Framework is installed; probably you missed installing the 4.5 framework on the deployment server. As IIS is already set up, it is once again necessary to register ASP.NET for the latest framework using

C:\Windows\Microsoft.NET\Framework\v4.0.30319>aspnet_regiis.exe -i

Even then, I still got the 404. Eventually, I got the tip to check out how the routing of extensionless URLs works in ASP.NET. By adding

<system.webServer>
    <modules runAllManagedModulesForAllRequests="true" />
    ...
</system.webServer>

to the web.config file of my Web API, the routing now works fine.

How to get the favicon.ico from any Page

Recently, I needed to retrieve the favicon.ico file from a website. As I had to process the file programmatically and render it on a website, it would have been quite a lot of manual work to get the .ico file and make sure the browser renders it correctly. After digging around, I learned about a secret URI, probably once provided by Google’s social bookmarking service Google Shared Stuff. While Google Shared Stuff was launched in 2007, it was already discontinued in 2009. However, this one URI seems to work perfectly, maybe because it is still used extensively within Google.

The Secret

To get the favicon.ico file from any arbitrary page, you simply have to use a URI following this pattern:

http://www.google.com/s2/favicons?domain=www.example.org

Eventually, this URI will provide you with the favicon of www.example.org as an image, retrieved using http://www.google.com/s2/favicons?domain=www.example.org.
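
Since I had to process the file programmatically anyway, a small Python sketch (assuming the requests package; the domain and file name are just examples) can fetch the icon and store it as a PNG:

import requests

domain = "www.example.org"  # example domain; replace with the site you are interested in
url = f"http://www.google.com/s2/favicons?domain={domain}"

response = requests.get(url)
response.raise_for_status()

# The service returns a small PNG, ready to be saved to disk or embedded via an <img> tag
with open("favicon.png", "wb") as favicon_file:
    favicon_file.write(response.content)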

How it Works

Some more examples to see how it works:

  • Facebook
  • TechCrunch
  • aheil blog
  • Google
  • dotnetpro Magazine
  • heise.de
  • Google+

While most sites keep their favicon.ico file right in the root of the website, others like Google don’t. Actually, you might find the Google+ icon located at

https://ssl.gstatic.com/s2/oz/images/faviconr2.ico

While retrieving the favicon.ico file from its standard location is probably not a problem at all, the secret URI provides one major advantage: you’ll get the icon as a nice 16×16 PNG file, ready to be rendered in any <img> tag right away.

The Risk

As every time you build upon a Google service, as I did before, it might disappear tomorrow without notice, leaving your site with quite a bunch of 404s. Even worse, as there seems to be no official support for this URI, there won’t be any notice or deprecation period before it is switched off, as happened with other services like Feedburner.