
December 29, 2013

The problem with on-line apps

Okay, I have to admit on-line web apps are absolutely great. You can use any computer, and as long as you remember your password you can use the tool from anywhere in the world: you don't have to bring your own laptop, you don't have to worry about backups and disk failures, and you are always on the latest and greatest version of the software. Isn't it great?


Well, it is great for as long as it lasts....


One after another, companies and start-ups fail, get bought, or get merged into another (mostly commercial) product, and with little to no notice the service is shut down. It happens very often, more often than you might imagine.

Let's take Google Reader for example. What a great service it was. And it was shut down, regardless of the protests, regardless of the on-line groups opposing the decision. Vuru, a financials service, is also closing now with only a month's notice. Half of the applications I have ever installed in Chrome no longer exist. Half! And the other half are giants like Facebook, Gmail and so on. Even they can decide to close down at any given moment. As soon as they decide a product is no longer relevant or profitable, there is nothing stopping the company from shutting the doors and windows, and there is no way you can argue with that.

So what does that mean for you? Let's imagine this: you have worked for several days, maybe even weeks, creating the ultimate application in Google Spreadsheets. It is NOT portable, because you used all the excellent services provided in the scripting environment, so basically your sheets are useless outside of this company's environment.

But Google is almighty and great, they will never close anything down. Oh, but they will; sooner or later they will shut down the services that are not profitable for them. And then what?

I will tell you what: you are on your own. Currently there is no way to write those scripts portably so that they will work in any other known environment. I have a data collection and analysis tool written for the Google spreadsheet application, spanning 24 files. It works very well indeed, it is hosted for free and so on. But what happens when Google decides that the new (improved, by the way) spreadsheet application drops support for some of those services, like they actually did? Can you stay on an older version so all your stuff continues to work? No! Can you take your application elsewhere? No! Can you do something about it? No...

So next time you agree to fully rely on a free or paid on-line web application, remember this: you have absolutely no control over your data. And even if that is not a problem for you, consider this: you have absolutely no control over the software. And if that does not bother you either, you have no control over the availability of this software. It can disappear in an instant!

I, for one, am thinking about it. The problem is this: 6-7 years ago, when Google started this "the web is the platform" bullshit, it sounded like the end of an era: everyone could put their application on-line, reach any user on any device and do outstandingly well business-wise. Yeah... and now each and every one of those excellent performers holds you in a grip firmer than anyone before them. Yes, I am talking about Microsoft. As "evil" as they were / are, they would NEVER make you lose your application. You had a choice: stay with the older version and protect yourself the ways you can (intranet, firewalls, whatever), or upgrade and pay anew, once again becoming master of your applications. Now this choice is gone and you are completely at the mercy of those companies. Think about that. You do not own anything: not your data, not your software, not its availability. You control nothing and have no rights.

So congratulations to all of you on-line service and app users. You have been schemed out of your digital hold on things. What will happen next? Who knows. But the next time someone starts praising the advantages of the cloud and web apps to me, I might just kick him hard.

December 27, 2013

RSS for personal use

After Google Reader was shut down in July, I migrated first to Feedly and then to InoReader.

InoReader is very good, if you can get used to some of its limitations.

Lately I have been reading more and more about privacy concerns, and it turns out that all those on-line services have one or more issues when it comes to your privacy. Basically, they can do whatever they want with the statistical data collected, in one form or another, from your usage and interests.

I can understand why this is not a big concern for some people; however, there are also people who strongly disagree with such policies.

I tend to be neutral on it, but just for the sake of argument I decided to take the other side and see if I could set up a local solution for RSS reading.

Now, it is clear to me that a large percentage of the younger population prefers to get their news pre-filtered by their peers (via Facebook, Twitter and G+, for example), but I have several very different interests and no particular person(s) I can count on to provide enough information on all the topics that interest me, so I use RSS news feeds daily.

A quick look at free Linux solutions for RSS reading reveals that most of them are console based and not very useful if your feeds contain lots of media (pictures, embedded audio and/or video), so I decided to go with liferea.

Now, one interesting aspect of liferea is that it can sync with the on-line services you already know (like Feedly and InoReader). However, the objective here is to be independent of them.

So what you do is basically export your feed list from the service provider and import it into liferea.
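The feed list travels as an OPML file: both Feedly and InoReader can export one, and liferea can import it. The snippet below is a minimal sketch of what such an export looks like; the feed names and URLs are just placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<opml version="1.0">
  <head>
    <title>My feeds</title>
  </head>
  <body>
    <!-- One outline per subscription; xmlUrl is the actual feed address. -->
    <outline type="rss" text="Example blog" xmlUrl="http://example.com/feed.xml"/>
    <outline type="rss" text="Another site" xmlUrl="http://example.org/rss"/>
  </body>
</opml>

Liferea has an OPML import option in its subscriptions menu; the exact wording may differ between versions.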

Liferea keeps all its data in .liferea_1.8 in your home directory, so it is easy to replace that directory with a symlink and actually store your data on portable media you can take with you. Note that you should use a fast flash drive; low-end devices are too slow and will result in a bad user experience.

I think the same could be done for your Firefox profile folder (under ~/.mozilla). Even though Mozilla says it protects your data and encrypts it, Google definitely does look at your usage. Chrome is a very good browser, but I do not feel comfortable using it for my day-to-day browsing, so I use it primarily for development.

As a developer I like the idea of the open web, but with a business-trained mind I can clearly understand that those "free" services have to run on something, and that 99% of users do not actually pay for the premium features. It is very hard to stay afloat with only free-riding users, so it is only natural to try to capitalize on user statistics. This is why I think most services should be pay-walled. If you pay, you have the right to demand conditions. If you do not want to pay, use other, personal solutions. Free services should die.

And if this means RSS might die, so be it.

December 11, 2013

Component goodness (Polymer) - (with update on readiness)

After several months of active development I think Polymer deserves another trial / deep look.

My initial reaction to it was negative due to several assumptions I had:
  • there is no way to compile down to one file
  • there is no 'global compile-time checking'
It seems my first objection is now being addressed and there will be a way to compile all the imports and links down into one single file, which is great. The second one is hard to explain if you are not used to the Dart or Closure way of doing things, but basically those tools build a compile-time graph of all your calls, produce an AST of your application and operate on it (checks, asserts, rewrites etc.). The thing is, you should not really need that if you use the declarative approach to building your application, which is what Polymer assumes you are trying to do.
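To make "declarative" concrete, here is roughly what a trivial element looks like in the current Polymer syntax. The element name and the bound property are mine, so take it as a sketch rather than a recipe:

<polymer-element name="demo-greeting" attributes="who">
  <template>
    <!-- Plain markup with a binding: the paragraph updates whenever 'who' changes. -->
    <p>Hello, {{who}}!</p>
  </template>
  <script>
    // The script only supplies a default value; all the wiring lives in the markup.
    Polymer('demo-greeting', { who: 'world' });
  </script>
</polymer-element>

Once imported, <demo-greeting who="Polymer"></demo-greeting> is all a page needs in order to use it.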

So how do we do this?

First, go get the code. I used a c9.io workspace, but if you have a Linux/MacOSX machine you can do this locally. I tried all the methods: zip archive, git pull and bower. All of them work, but some of the examples need path tweaking to find the files. Also, you might need to add both polymer.js and platform.js if the non-minified version is used (bower and git).
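For reference, the wiring I ended up with looks more or less like the page below. The bower_components paths are just what my bower install produced, so adjust them to wherever your copy of the files actually lives:

<!DOCTYPE html>
<html>
  <head>
    <!-- platform.js provides the polyfills; the import pulls in Polymer itself. -->
    <script src="bower_components/platform/platform.js"></script>
    <link rel="import" href="bower_components/polymer/polymer.html">
    <!-- Your own elements are brought in the same way. -->
    <link rel="import" href="elements/demo-greeting.html">
  </head>
  <body>
    <demo-greeting who="world"></demo-greeting>
  </body>
</html>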

I find that one of the most interesting tests one can do with a new technology is to try and build something complex that you have already built with another technology. Then compare the result, the experience and the speed of development between the two approaches.

My intent is to take Longa, a product developed with the Closure tools, and try to re-create it with Polymer.

Longa is a large product (~2MB of unprocessed JS and templates). Compiled with the Closure compiler it boils down to 115KB (JS plus templates) and is further compressed by the server to 30KB, which is a reasonable download size for a mobile web application.

The HTML cannot be compiled down (names matter, it's not like JavaScript), so any savings have to come from more compact forms of expression.

For now I have re-created only a single channel record, and I can say I am already impressed with what the platform is capable of without me actually needing to write any code at all. Of course, there is much to be done and even more to be desired.

For example, styling two shadowed elements does not work as I was expecting (in the context of a single Polymer element, and the elements are regular ones: a div and an image tag). Maybe it is a bug, maybe I am missing something, but it is still a hurdle for a newcomer. No matter how many tutorials are out there, pretty much all of them concentrate on isolating the styles from the outside world and not on how styling works inside the element.

One of the most interesting things I noticed is that you can bind the styling of an element and mix it with an expression, so an element can calculate style values from its properties and some arithmetic. For example:

#selector { height: {{height+20}}px; }

is a valid style inside the template of an element!
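To show where such a rule actually lives, here is a trimmed-down sketch. The element and property names are mine; only the height expression is the one from above:

<polymer-element name="demo-box" attributes="height">
  <template>
    <style>
      /* The bound expression is recalculated from the 'height' property. */
      #selector { height: {{height + 20}}px; }
    </style>
    <div id="selector"></div>
  </template>
  <script>
    // A default value, so the expression has something to start from.
    Polymer('demo-box', { height: 100 });
  </script>
</polymer-element>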

Another interesting aspect is the abstraction of complex routines into elements. For example, ajax calls can be hidden inside an element; you listen on that element, or on any of its parents, and use it like a regular element (just like the select or change events on native controls).
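The ready-made ajax elements do exactly this, but the pattern itself is easy to sketch. Everything below (the demo-fetch name, its url attribute, the demo-response event) is made up for illustration; the point is that the consuming code only sees markup and a DOM event:

<polymer-element name="demo-fetch" attributes="url">
  <template></template>
  <script>
    Polymer('demo-fetch', {
      ready: function() {
        // Fire the request as soon as the element is in the document.
        var xhr = new XMLHttpRequest();
        var self = this;
        xhr.open('GET', this.url);
        xhr.onload = function() {
          // Surface the result as a plain DOM event that bubbles up the tree.
          self.fire('demo-response', { body: xhr.responseText });
        };
        xhr.send();
      }
    });
  </script>
</polymer-element>

With that in place, <demo-fetch url="/data.json"></demo-fetch> in the markup plus an addEventListener('demo-response', ...) on the element or any ancestor is all it takes.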

All in all, it is at least an interesting toy. I am not sure how fast the browsers will get to the point where all of this works without hacks or shims, but at least for a certain class of applications it will be a breath of fresh air after the JS insanity we have been living with for the last 10 years.

As a conclusion: even if your code is very complex (for example something like a drawing board or a document viewer), you can at least try to ship it as a web component, so others can use it in the simplest form possible: by just importing it and using it as a tag in their markup. I for one will try that!


Update (12/16/2013): It turns out most of these things do not work in FF/Mobile Safari, especially the styling. Some style rules do not work even in Chrome, with no apparent reason. For example, a rule like this:

padding-left: 4px;

does not work but this one does:

padding: 0 0 0 4px;

I guess there are still lots of bugs and missing features, but clearly, if you just want to dip your fingers into the power of web components, now is a suitable time. However, if you want to go for full-blown large apps, you should definitely wait at least a few months!