Here's the problem: it is not very easy to scroll a document when you're inside an input element. Arrow keys don't work, and Page Up/Page Down jump in big increments. What if you want to see just a few lines below the current element? Our clients hate to scroll. And they hate having to use the mouse. This plugin just brings the two together.
FScroll is a jQuery plugin which makes the page scroll to the currently focused element, keeping its position centered with respect to the document. This helps keep a bit of "context" around the currently focused element - since it is centered, you can see a few elements both above and below it.
Here are the sources. And here's a page explaining its usage in some detail. And oh, it does nested centering too. But it requires that the 'nesting' container have a CSS style of position: relative (in the demo page, the div enclosing the table is positioned relative). This was not strictly necessary, but it made the coding a bit easier. If you can't live with the styling restriction, let me know. I'll try to do what I can.
So the management has finally approved your project, and has asked you to start working on it. Heh ... little do they know that you'd already been working on it, and have a nice prototype working, and it's all saved in your local git repository. But your company is not as cool as you are - it has its own svn repository, and now you have to import your code into it, history and all.
Here is the git tree, as you have developed it:
.. and your svn repository looks similar to this -
Now you could copy all the files from the git repository into trunk, and commit it. But that is really not the way it should be. For one - no one will know the reason for *anything* in this repository before the big bang. Also, there might have been legitimate reasons for people to branch out from some earlier state of the code, but now no one will even know.
Fortunately, a mail on the kerneltrap archives tells us how we can export a git repository, along with all its history, into an svn repository.
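The first step is to initialize git-svn against the company repository - something along these lines (the URL is just a placeholder for your own svn server):

git svn init https://svn.example.com/project --prefix=svn/ -s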
The --prefix gives you remote tracking branches like "svn/trunk", which is nice because you don't get ambiguous names if you later call your local branch just "trunk". And -s is a shortcut for the standard trunk/tags/branches layout.
Fetch the initial stuff from svn:
git svn fetch
Now look up the hash of your root commit (should show a single commit):
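Something along these lines works (assuming your history has a single root commit):

git rev-list master | tail -1

Then graft that root commit onto svn/trunk - the graft file is just a list of "commit parent" pairs, so substitute the hash printed by the command above:

echo "<root-commit-sha> $(git rev-parse svn/trunk)" > .git/info/grafts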
Now, "gitk" should show svn/trunk as the first commit on which your master branch is based.
Make the graft permanent:
git filter-branch -- ^svn/trunk --all
Drop the graft:
rm .git/info/grafts
gitk should still show svn/trunk in the ancestry of master
Linearize your history on top of trunk:
git svn rebase
And now git svn dcommit -n should tell you that it is going to commit to trunk.
If you check your svn repository log, it will look like this.
All the history, nice and linearised for svn.
Keep in mind though, that this method is lossy. All the branches have been linearised, and you can no longer "check them out" in the original git repository. Apart from that, things work just fine, and you can continue to commit in your local git repository, and push to svn as and when needed.
This is what population growth, and the resulting competition, can do.
~
English is a foreign language to most Indians, and yet it seems to be preferred for dispensing information. It's not uncommon to see a gaffe every now and then.
I thought I'd take a second look at the Mr. P and Mr. S problem, which I'd posted more than a couple of years ago. The last time I tried it, I wasn't successful. I had a strategy to solve it, but somehow I just couldn't translate it into code.
I've been programming a lot with C# lately, and decided to use LINQ to solve the puzzle. Although not very concise, compared to the Python and Haskell solutions out there, it does print out the right answer. After you've tried to solve it yourself, you can have a look at my solution here.
There's something special about LINQ queries. All LINQ queries are deferred, which means they aren't executed until their results are actually enumerated. They are also re-executed every time they are enumerated, so they always reflect the current state of the underlying data. Say we have a list of numbers, and a query on it like so:
var numbers = new List<int>();
var query = from i in numbers select i;
The query hasn't been executed yet. We add a few numbers to the list, and compare the counts of the list and the query.
numbers.Add(0);
numbers.Add(1);
numbers.Add(2);
// 3 elements in the list, 3 in the query
Assert.AreEqual(numbers.Count, query.Count());
The test passes. LINQ queries are "live", very much like functions. Usually, this is a good thing, as no operation is performed until it is actually needed. However, there are exceptions. For example, I used these three ranges -
public static IEnumerable<int> OddRange(int stop)   // returns odd numbers up to "stop"
{
    for (int i = 1; i < stop; i += 2)
        yield return i;
}
public static IEnumerable<int> EvenRange(int stop)  // returns even numbers up to "stop"
{
    for (int i = 2; i < stop; i += 2)
        yield return i;
}
public static IEnumerable<int> Range(int stop)      // returns all numbers up to "stop"
{
    for (int i = 0; i < stop; ++i)
        yield return i;
}
To define the Deferred() and Immediate() functions below:
public void Deferred()
{
    var all = Range(limit);
    var even = from e in EvenRange(limit) where all.Contains(e) select e;
    var odd = from o in OddRange(limit) where !even.Contains(o) select o;

    var query = from q in odd select q;

    foreach (var i in query) { var j = i + 1; }
}
public void Immediate()
{
    var all = Range(limit);
    var even = (from e in EvenRange(limit) where all.Contains(e) select e).ToArray();
    var odd = (from o in OddRange(limit) where !even.Contains(o) select o).ToArray();

    var query = (from q in odd select q).ToArray();

    foreach (var i in query) { var j = i + 1; }
}
all, even and odd are three sub-queries, each using the previous one. The Immediate() function only differs from Deferred() in its forced execution of the sub-queries with ToArray(). However, Immediate() performs much better than Deferred(). I knew that LINQ operators are really just function calls, and that iterator blocks are exploded by the compiler into a lot of code. But Deferred() was waaaayy slower than Immediate(), and the time taken grew steeply with the value of limit. This couldn't be just some extra code.
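A rough way to see the difference for yourself - nothing fancy, just a Stopwatch around the two calls (with limit set to a few thousand, the gap is hard to miss):

var sw = System.Diagnostics.Stopwatch.StartNew();
Deferred();
Console.WriteLine("Deferred : {0} ms", sw.ElapsedMilliseconds);

sw = System.Diagnostics.Stopwatch.StartNew();
Immediate();
Console.WriteLine("Immediate: {0} ms", sw.ElapsedMilliseconds);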
I posted a query on stackoverflow, and it did not disappoint. It is quite obvious in hindsight. This statement -
var odd = (from o in OddRange(limit) where !even.Contains(o) select o).ToArray();
in deferred mode, turns out to be pretty expensive indeed. It contains a call to even.Contains(o). While in the immediate mode this is an O(n) operation, in deferred mode, the sequence of calls looks like this -
odd --> even -+-> EvenRange()
              |
              +-> all --> Range()
Every enumeration of odd calls even.Contains() once per element; each of those calls re-runs the even query, and every element of even in turn calls all.Contains(), which re-enumerates Range(). A simple O(n) operation is now O(n³). We can do better than O(n), however, by using a HashSet.
var evenSet = new HashSet<int>(even);
var odd = from o in OddRange(limit)
          where !evenSet.Contains(o)   // Contains() is now O(1)
          select o;
If my journal template hasn't changed since this post, you should see a µBlog roll on the sidebar. If you've clicked on any of the links, you'd know that those notices (or 'dents') come from identi.ca.
identi.ca is a website very similar to twitter, only better. It's built with the open source laconi.ca project, and has tags and groups too. The killer feature for me is IM support, along with a decent command list. All you have to do is add their bot on Google Talk, and you can send/receive messages in real-time.
The commands currently supported by the IM bot are:
on - turn on notifications
off - turn off notifications
help - show this help
follow <nickname> - subscribe to user
leave <nickname> - unsubscribe from user
d <nickname> <text> - direct message to user
get <nickname> - get last notice from user
whois <nickname> - get profile info on user
fav <nickname> - add user's last notice as a 'fave'
stats - get your stats
stop - same as 'off'
quit - same as 'off'
sub <nickname> - same as 'follow'
unsub <nickname> - same as 'leave'
last <nickname> - same as 'get'
identi.ca also supports forwarding dents to twitter, so you won't completely alienate your fans on twitter. However, identi.ca doesn't pull tweets, so you won't see any @replies from twitter on identi.ca. At least until you can convince your friends to move from twitter.
identi.ca belongs to a larger ecosystem of OpenMicroBlogging software, which has adopted a common standard so that messages can be shared between implementations. If you use software that supports OMB, you won't alienate someone just because they happen to like something different (in contrast, the twitter community belongs only on twitter).
Another popular µBlogging site is jaiku, which will support OMB, and go open source soon. If identi.ca is not your cup of tea, or if you happen to like everything Google, jaiku may be for you.
There's a video about Intentional Software over at MSDN, and it's definitely one of the more brilliant things to have come out in a while. Most of the time, the people who best know how a piece of software should function are not the ones writing it. That is how people who write software for money make money - they pitch their ability to convert business requirements into "executables". With intentional software, everyone does what they do best.
All the places I've worked at had tools that made it easy for domain experts to give their inputs, since turnaround time is extremely important for business. Even for personal projects, I've toyed with modeling tools and OR mapping frameworks to make my model somewhat independent of the implementation. Never have I seen a tool so comprehensive, though - watching an electrical circuit being modeled as a diagram, along with the impedances and voltages, was amazing. This wasn't just the model being abstracted - it was the whole program, editable as text, diagrams or XML, and easily converted into executable code.
Martin Fowler has quite a lot to say about it too; there isn't much I can add to it. Open source junkies like me will look for an open source alternative for this, and will find JetBrains' Meta Programming System. The next few days will be exciting, as I try to evaluate this beast and see whether it fits into any of my current work.
Extension methods are a great way of adding functionality to existing classes. They almost make C# similar to Ruby or JavaScript, where none of the classes are "closed" - functionality can be added to them at any point in time.
For example, say we want the method ToTitleCase to be available to all String objects, so we define an extension method.
using System;

namespace MyExtensions
{
public static class StringExtensions
{
public static String ToTitleCase(this String word)
{
return word[0].ToString().ToUpper() + word.Substring(1).ToLower();
}
}
}
Now, wherever we want to use the method ToTitleCase, we include the namespace MyExtensions, and the following becomes valid C# code:
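Something like this, for instance (the string literal is only for illustration):

var title = "hELLO wORLD".ToTitleCase();   // "Hello world"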
However, C# extension methods are simply syntactic sugar. Any extension method calls in source, such as the one above, are transformed by the compiler into this:
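That is, roughly, a plain static call on the defining class:

var title = StringExtensions.ToTitleCase("hELLO wORLD");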
The extension methods for any given type are made available by the compiler, but they are compiled down to regular static function calls. Feels like cheating, I tell you.
Now what happens when we try to invoke an extension method via reflection?
[TestMethod]
public void TestMethodInfoInTargetClass()
{
// We won't find the method in String ...
Assert.IsNull(typeof(String).GetMethod("ToTitleCase"));
}
[TestMethod]
public void TestMethodInfoInDefiningClass()
{
// But we will find it in StringExtensions.
Assert.IsNotNull(typeof(StringExtensions).GetMethod("ToTitleCase"));
}
This should be obvious - we aren't going to find them on the target type - we'll find them in the class where they are defined.
So, how does this affect us?
Say you're browsing through some source code, and you see a call like this - myObject.someMethod(). If you need to call the method someMethod() dynamically, you can't use the type of myObject to reflect it. Instead, you need to know if someMethod() is an extension method, and if it is, you need to reflect it off the class in which it is defined.
That solves the problem when we know which extension method we need to call. If we don't, and we want to know all the extension methods available for a given type, we can use the attribute ExtensionAttribute. This attribute indicates that a method is an extension method, or that a class or assembly contains extension methods. Given this, we can implement a function that returns all extension methods defined for a given type.
IEnumerable<MethodInfo> GetAllExtensionMethods(Type targetType)
{
return
from assembly in AppDomain.CurrentDomain.GetAssemblies()
where assembly.IsDefined(typeof(ExtensionAttribute), false)
from type in assembly.GetTypes()
where type.IsDefined(typeof(ExtensionAttribute), false)
where type.IsSealed && !type.IsGenericType && !type.IsNested
from method in type.GetMethods(BindingFlags.Static | BindingFlags.Public | BindingFlags.NonPublic)
// this filters extension methods
where method.IsDefined(typeof(ExtensionAttribute), false)
where
// is it defined on me?
targetType == method.GetParameters()[0].ParameterType ||
// or on any of my interfaces?
targetType.GetInterfaces().Contains(method.GetParameters()[0].ParameterType) ||
// or on any of my base types?
targetType.IsSubclassOf(method.GetParameters()[0].ParameterType)
select method;
}
The above method was inspired by Jon Skeet's answer on Stack Overflow. It simply improves on it by also checking interfaces and base types, and by looking in all assemblies in the current AppDomain.
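To tie this back to invoking someMethod() dynamically - a quick sketch of how it could be used, reusing the names from this post:

var extensions = GetAllExtensionMethods(typeof(String));
var toTitleCase = extensions.First(m => m.Name == "ToTitleCase");

// extension methods are plain static methods, so the target goes in as the first argument
var title = (String) toTitleCase.Invoke(null, new object[] { "hELLO wORLD" });   // "Hello world"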
We were a bit late in our plans, but finally Samson gave the go-ahead.
We'll be taking off from the other end of the beach. See you there in five minutes.
The "other" end of the beach was quite some distance away. The wind sock put up to guage the wind velocity was too far to be visible. Barbara and Pascal decided to walk it up, and I took up the risk of pillioning with Joel. We found the wind sock in a couple of minutes - it was barely flying. Not very inspiring, and Joel found it amusing.
Don't worry. If there's not enough wind for these guys to make you fly, I'll pull the three of you with my bike.
Fat chance.
Samson and Mangesh arrived on a bike soon after, with the winch in tow. Now, Joel is among the healthier people around, and Mangesh did not seem too pleased to see him, but tried to put it across very diplomatically ...
We'll need a lighter person for the first ride ... let's not challenge the conditions right away. Around 50 kgs ... ?
Barbara was still a fair distance away, though. I said I was around 73. So long as it was not Joel, it was apparently ok.
I started putting on the gear. Knee caps and a helmet. I wondered whether they could help me survive a 1000 ft fall. Seemed like a moot point. But I was wrong - they would help me soon.
Barbara and Pascal reached soon after. They were both apparently happy to know they weren't the first ones on the ride. And along with Joel, they tried to unsettle me a bit.
Look at that, they've found their first scapegoat ... all the equipment will get tested on you.
Mangesh did a quick round of the basics with me. Apparently the gliders are very safe - they are designed to quickly recover from most mishaps during flight. Each glider has a glide ratio -
For a glide ratio of 7:1, if you drew a horizontal line of 7 ft and a vertical line of 1 ft downwards, you would be flying along the hypotenuse.
He also talked about thermals - wind currents that move upwards due to uneven heating of the land. They'd help us gain altitude, but they also bring turbulence. Since we were on the beach, we weren't expecting a lot of them.
We were almost ready now. Last minute instructions were doled out. Hands down by the sides, within all the harnesses. Run straight, and don't stop. I told Mangesh that it would be a bit awkward to run with both hands tied down, but he said it would only last for 5-6 seconds. As the glider went overhead, the harnesses would loosen. Seemed doable. The wind was okay - not too much of it. Too little, maybe. The winch was attached.
73 kg, NIL wind. Over.
Once the conditions were confirmed, we were all set to go. The two men by my side grabbed my arms, and pulled me. I ran, or at least tried to ...
The glider gained a bit of height, and my feet were airborne. The glider came down again, however, and my feet didn't find the ground. I lost my balance, and ...
(somehow, I have a feeling that this will be the most watched video in this post ...)
As we fell, Mangesh let the release latch go. The winch separated, and took off with a jerk. Joel didn't understand what happened, and was a bit concerned.
Hey, how did the rope come off?
I got up, and dusted myself off. Barbara had commented earlier in the day that I still liked to play in the mud. I'd got a whole lot of it on myself now. One of the knee caps had been left behind, a fair distance away. You never need these safety precautions, until you do.
We had to try again. The glider was set up behind us, the winch re-attached. We got set again. I was told that it was all up to me - the pilot couldn't do squat if I didn't pick up speed. As the winch started pulling, so did these guys, and they shouted -
Run, run, RUN!!!
And run I did.
The whole experience was surreal - I hadn't felt anything like it before. As the winch pulled on us and we went higher and higher, I couldn't help feeling that I was reaching the point of no return - that a fall from this height would be fatal. But those thoughts subsided in a few seconds - the glider was extremely comfortable. Mangesh was pretty calm through it all - he must've done this a million times. He knew the answer, yet he asked -
Wonderful, isn't it?
And yes it was. Pascal took along his camera with him when he went up (the third flight of the day, after me and Barbara), and shot this amazing video.
It was wonderful up there. Mangesh kept talking - about the glider, about the winds, about the scene below - but frankly I don't remember much of it. I was busy enjoying the view.
We really have Samson, and the Space Apple club, to thank for this wonderful experience. If you want to know more about such events in the Vasai-Virar region, you can visit their website.
There are also a few photos of the day on my picasa album.
Of the many great things that Firefox has brought us, tabbed browsing is the earliest, and arguably the greatest. Tabs existed in applications before Firefox, but I believe it was Firefox that made them mainstream. Soon, many applications started touting tabbed windows as one of their main features. Tabs were a good way to keep all "related" windows together - finding the window you needed had never been easier or faster.
But now, tabs only seem to hinder my workflow. Over the past few years, Firefox has grown, and so have I. However, Firefox still focuses on one task - web browsing - and I have moved on to doing things that involve more applications. When I am developing, I usually have two instances of my IDE, API documentation in Firefox, IM windows in Pidgin, and git terminals. Most of the time, I cannot reach the window I want in a single attempt. Going back to a page I am browsing from my IDE involves Alt-Tabbing to my browser, and then Ctrl+Tabbing to the correct tab.
I seem to have found a solution to my window switching woes on Windows, although I needed two applications to get what I want. The first is TaskSwitchXP (use compatibility mode on Vista). All my applications now run in non-tabbed mode (here's an extension for Firefox to help with this). When I need to switch between windows of all applications, I Alt+Tab to get this -
However, when I need to switch between instances of the same application, I Ctrl+Alt+Tab my way to this -
Notice that the taskbar shows many application windows open, however, the switcher only shows Firefox windows. This is very similar to Ctrl+Tabbing within an application.
Multiple windows can be a pain though, once you have too many of them - not uncommon while browsing. If the window is too far back on the Alt+Tab MRU list, I use the second application - this AutoHotkey script - to do an incremental search-as-you-type over the list of open windows. Like so -
I'm still looking for something like this on Linux - an application instance switcher, and an incremental search over all windows. The problem is somewhat mitigated by workspaces, but not completely solved, IMHO. You can group related windows on a workspace, and the window switching you have to do gets a lot less. But tabbed applications still break this flow, and you can never get to the window you want directly.
Mac OS X apparently already has a shortcut to this effect.
Some browsing, debugging, and IRC chats later, I have managed to set up a git repository on Windows Vista using Cygwin, with a few unexpected hiccups. I will try to repeat the process on a virgin setup to come up with a more authoritative flowchart of how to go about things. For now, I'll just list the issues I faced.
I used gitosis to host git repositories over ssh. It's pretty elegant, really. Administering gitosis is limited to managing a configuration file and user keys, which are themselves kept in a gitosis-hosted git repository. Neat.
These two links cover everything you need to do to host git repos. The first should be straightforward. However, since the second is written for Linux users, there are a few deviations on Windows. Try to follow it, and if you face difficulties, refer to the tips below.

Installing gitosis
Log in to an administrator account. Do a
cd ~/src
git clone git://eagain.net/gitosis.git
cd gitosis
python setup.py install
If the last command fails thusly
Traceback (most recent call last):
  File "setup.py", line 2, in ?
    from setuptools import setup, find_packages
ImportError: No module named setuptools
you may need setuptools from here. Scroll to the bottom, download an egg, and do
sh setuptools-0.6c9-py2.5.egg
Repeat the last step. After you have installed gitosis successfully, run the following command
chmod +r /usr/lib/python2.5/ -R
This was the first wtf. The egg we just downloaded gets installed with administrator ACLs; the above command ensures that everyone has read access to the installed files.

Setting up gitosis
You need to add the git repository user `git' the Windows way (the adduser command in the second link will not work). Once you've done that, make sure you've run the following commands:
# in the new 'git' user's account
ssh-user-config
# from the admin account
# (domain users need to add '-d domain_name' to the mk* commands)
mkpasswd.exe -l > /etc/passwd
mkgroup.exe -l > /etc/group
Of course, this assumes that you've copied your key to the /tmp folder. The rest of the write-up should be fine, except for one thing. When you try to `git commit' your repo configuration, you may see an error like:
and you should be good to go. This should do it. If you still run into problems, let me know. If you find solutions to them, post them up so that I may link to you. And when you do have a git repository up, give me the url to fork from :)
Using a combination of Gmail address aliases and canned responses, you can have Gmail automatically send directed, relevant responses on your behalf.
You have to enable canned responses ...
... add a canned response while composing ...
and set up an appropriate filter.
As you can see in the pictures above, I have set up Gmail to respond to any mail sent to sandesh247+ci@gmail.com with my contact information. I could have a public web page that does the same thing, yes, but this way I reveal the information only to those who explicitly ask for it, plus I have a record of all the requests.
You have to be careful though - this does set your account up for spam. An automated reply immediately validates an email address, so make sure you do this only with addresses you want to make public.
While the above is funny, using a good editor, and using it effectively, usually prevents such situations. A good auto-completing editor isn't necessarily about helping you type faster - it is most helpful for the context-sensitive documentation it provides. It brings documentation closer to the act of writing code, and saves the context switch introduced by Alt+Tab, F1, or any other documentation-invoking key-binding.
Like so -
If you haven't found out how to make your editor do this, you probably should.
`Emosanal Atyachar' has to be the most brilliant song in a while. While initially it sounded almost repulsive, it now makes me laugh every time I listen to it. The devil is, as they say, in the details.
Bol bol why did you ditch me,
zindagi bhi lele yaar kill me,
bol bol why did you ditch me whore...