2011/12/05

Questions about Git

I'm reading Eric Sink's Version Control by Example, and now that I'm starting to hit the examples, I'm finding them a bit of a problem.

Eric's example is one dev in Birmingham, UK and one in Birmingham, AL, writing C code and committing to a server in Cleveland. Right now, the dev team for my office is two coders separated by all of five feet, writing Perl code for one of two servers, one of them being the web server. I'm easing into git, as you might guess from the reading material, and like most working environments I've been in, our version control system has been "copy a backup before you muck with the file", which I know is dumb and useless (especially in the lab, where we rely on RAID for data protection and thus don't really keep backups). I'm kinda taking the lead on this, having been burned enough times to want to protect myself, but I don't really know much about it yet.

I've been using git so far to keep track of changes locally, just doing git init and git commit within the local file system. Is this enough? Or do we really, truly need a server?
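
For reference, here's what I'm doing now, and what I think the "server" version would amount to if we decide we want one: just a bare repository on a box we already have (the web server, say), pushed to over ssh. The hostname and paths below are made up.

    # what I'm doing today: purely local history
    cd ~/dev/foo
    git init
    git add .
    git commit -m "first commit"

    # if we want a shared copy, a bare repo on an existing box is probably
    # "server" enough; hostname and path here are placeholders
    ssh webserver 'git init --bare /srv/git/foo.git'
    git remote add origin webserver:/srv/git/foo.git
    git push origin master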

And since the code sits in /home/user/web/cgi-bin/foo/bar/blee/quuz.cgi or /home/user/bin/hoge.pl, I'm wondering: should those be the directories I run git in? Or should I do the work in /home/varlogrant/dev/hoge.pl/ and copy hoge.pl over to /home/user/bin when I'm happy with it and it has been committed?
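
If I go with the second option, I imagine the day-to-day would look roughly like this, with the repository living in the dev directory and the live copy only ever updated from a committed state (the deploy step is just a copy for now):

    # work and commit in the dev directory
    cd ~/dev/hoge.pl
    $EDITOR hoge.pl
    perl -c hoge.pl                      # at least make sure it compiles
    git add hoge.pl
    git commit -m "describe the change"

    # only then put the committed version where it actually runs
    cp hoge.pl ~/bin/hoge.pl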

I have a bigger question about how to take a large selection of interconnected Perl modules and make them 1) test-driven in the real, chromatic-approved way, 2) working with git in a useful way, and 3) usable from ~/lib on several dissimilar systems. I have a cheap hack for #3, but if I break apart and reassemble the modules, I can probably do it cleaner and smarter.
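
For #1 and #3, my rough mental picture, assuming the conventional lib/ and t/ layout and Test::More, is something like this (the distribution name here is made up):

    # one repository per distribution, laid out the usual way:
    #   Foo-Bar/
    #     lib/Foo/Bar.pm
    #     t/basic.t          # Test::More tests
    cd ~/dev/Foo-Bar
    prove -l t/              # -l adds lib/ to @INC, so nothing gets installed

    # on each machine, point perl at the checkout instead of hand-copying
    # files into ~/lib; PERL5LIB is the honest version of my cheap hack
    export PERL5LIB=$HOME/dev/Foo-Bar/lib:$PERL5LIB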