Friday, December 07, 2007
Anyone have their own pet program? (Does it do regexp? Does it support binary-format scheme files that you can attempt to apply to the data?)
Thursday, November 29, 2007
I pointed it out to Miller, and we both agreed that it’s nice to see that some thematic elements that make a Mac a Mac are retained even to this day.
So, now that I’ve wiped off the crud that had grown(?) on it while in storage, is it worth trying to find someone to fix the hard drive? Or is it just something that I should replace? Anyone have an Ethernet to old skool AppleTalk solution and/or is it worth it to try and hook it up to the internet again?
Tomorrow, I get to see if my Duo 280 will boot in its docking station.
Having looked first at this post, and then later at this hint, I decided to bring back multiple boot partitions. I tried using the disk utility mechanism to resize the volumes to no avail. I think it always corrupted the MBR that the bootcamp utility set up. I ended up buying iPartition and playing around with it a bunch, but most of my edits would cause Windows to no longer boot either. However, I did manage to come up with an order of operations that makes it work.
- Boot from your Leopard disk into its copy of Disk Utility.
- Format the entire disk as a single HFS+ (Journaled) volume.
- Install Leopard.
- From Leopard, run Bootcamp.
- Resize the partition that Bootcamp suggests so that the Leopard partition is the size you ultimately want it to be, and that Windows takes up the rest.
- At this point, instead of rebooting and installing Windows, I booted onto an external USB drive with iPartition installed. (You could boot onto the iPartition boot CD instead.)
- Using iPartition, I shrank the Windows partition to its ultimate size, and added after it two extra HFS+J partitions for my other OS partition and my data partition.
- I put in the Windows Vista RTM DVD and rebooted holding down “C” to install Vista. It saw the shrunken partition, let me reformat it, and installed.
[These instructions are specifically for starting from scratch. If you’re trying to do this with pre-existing volumes, I suggest backing them all up to an external device using Disk Utility or, better yet, Carbon Copy Cloner, and WinClone for bootcamp partitions. You should then be able to restore after the partitions have been finally resized. You may still have to boot from a Windows disk, if only to format the bootcamp partition as NTFS (and not bother installing further), before WinClone can restore your backup.]
The only thing that is somewhat frustrating about this now is that, even though I can boot between Tiger, Leopard, and Vista, as far as Vista is concerned, the drive (disk0) is an MBR drive and has three main partitions, the EFI partition, the Leopard partition, and the Vista partition; the remaining space is “unused”. Disk Utility while booted into Mac OS X, on the other hand, happily lists the other partitions. This isn’t usually annoying, but MacDrive only sees and can mount the Leopard partition. (Mediafour claims that the partition maps “are incorrect or damaged beyond MacDrive’s ability to handle.”) So if I wanted to still have an available-to-Windows data drive, I’m going to have to back up everything, and restart these instructions, only making the original Leopard partition large enough to accommodate the data partition, and moving what will ultimately be the Leopard partition to the end. *sigh*
If you all have better options, please let me know.
P.S., Since all this, yet another post describes how to do this, and adds Linux to the mix.
Wednesday, November 28, 2007
There is no technical limitation of CoreCLR that would not allow calls from it into Cocoa (ObjC[++]) or vice versa, in the same way that Cocoa can call into C/C++ normally. The CoreCLR, like the desktop runtime, supports a hosting interface that allows .NET to be hosted in an application environment. Unlike the desktop runtime, CoreCLR is currently only ever hosted (e.g., in the browser control called Silverlight). You can ask the hosting interface to create a function delegate, which will take a managed function and turn it into a C-style function pointer, which, were you to call it, would marshal all the arguments into the managed world and run managed code. Beyond that, you could theorize creating something like the ObjC-Perl bridge, where managed objects were made visible directly to the ObjC runtime1.
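The delegate-to-function-pointer mechanism is easier to picture with a concrete analogue. Python's ctypes does the equivalent trick, wrapping an interpreted function as a C-style function pointer that marshals the arguments on every call; here is a sketch in that spirit (this is Python's public API, not CoreCLR's hosting interface):

```python
import ctypes
from ctypes import CFUNCTYPE, POINTER, c_int, sizeof

libc = ctypes.CDLL(None)          # finds the platform C library on POSIX

# A "managed" (here: Python) comparison function, wrapped so that it is
# callable as a plain C function pointer; ctypes marshals the raw C
# pointers into Python objects on every invocation.
@CFUNCTYPE(c_int, POINTER(c_int), POINTER(c_int))
def py_cmp(a, b):
    return a.contents.value - b.contents.value

arr = (c_int * 5)(3, 1, 4, 1, 5)
# Hand the wrapped function to plain C code (libc's qsort):
libc.qsort(arr, len(arr), sizeof(c_int), py_cmp)
print(list(arr))                  # -> [1, 1, 3, 4, 5]
```

The shape is the same in the CLR case: native code holds an opaque function pointer, and each call through it transitions into the managed world with the arguments marshaled along the way.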
That said, there are a couple of things that would stymie the average developer if they wanted to do this:
- At the moment, only internal (i.e., Microsoft) clients of CoreCLR have access to the hosting interface.
- The security model of CoreCLR, at least in the Silverlight timeframe, is changing so that only Microsoft trusted libraries have access to sensitive OS operations, and normal developers’ code would be sandboxed (much like it would be if it were running in Silverlight).
I don’t have too much insight as to whether these things might change in the future. However, there are a bunch of details that would have to be resolved first, e.g., how to ensure 3rd party CoreCLR users keep their CoreCLR serviced with the appropriate security fixes. If you’re interested, let your request be known in the feedback forums up on http://silverlight.net.
1Although, at this point, you’d end up with double garbage collection. If an ObjC object held a reference to a managed object, and then lost its last reference, then eventually the pool would get collected, which would then release the (possibly) last reference to the managed object, which would then get collected when the CLR GC occurs.
I joined the CLR team in August of ’06, and the work to “trim” the desktop runtime engine and frameworks down to the for-use-in-Silverlight CoreCLR runtime engine and frameworks had already been completed. Back in June(!)1 I talked with Jonathan Keljo, who was the program manager working on this problem, about what went on during this period.
Ultimately, we cribbed. Instead of trying to figure it out for our (CLR) selves, we ended up looking at the pre-existing product that was already a slimmed-down version of .NET: the Compact Framework2. We looked at their surface area and decided to see if we could match theirs. They had foregone features, and we chose similarly. One of the many ways CF manages size is simply by limiting the number of convenience functions.
When we prototyped the CoreCLR surface area, we ripped out some stuff based on instinct–what were the biggest (in terms of data and code size) features? We took out the top few (fusion, COM Interop, server GC, debugging support3), and strangely enough, we were close to our size goals.
On the framework side, we had inherited some subsetting already from the Rotor project. Since the platform adaptation layer (PAL) did not support all of the APIs we were calling in Win32 in the desktop version of the CLR to support some managed libraries, some of those managed layers were removed. In certain cases, we expanded upon the pre-existing PAL. The PAL had been written to be simple and very cross-platform, but knowing we wanted to support the Macintosh, we could supplement it with Macintosh-specific implementations of certain important APIs so as to not have to overly subset.
It’s not the cool-use-of-advanced-software-refactoring-principles answer I might have wanted, but it’s reflective of the environment we often find ourselves in–partially constrained by our desire to leverage previous work, and partially enabled by that same desire.
1Yes, I know I’m behind.
2I knew this was going to be a cop-out answer, which is why I waited for so long to post. Perhaps I’ll get a chance to interview someone from CF and figure out how they went about their trimming procedures.
3Debugging support isn’t so much removed from CoreCLR as split out from the main product so it can be downloaded separately as an SDK.
Friday, November 02, 2007
After getting back to work, I made it a point to get out from under the heap of mail I’d accumulated over the course of several years before starting any new projects. Some 8,000 semi-read Inbox e-mails later, I now only have a 400-message “reviewed” folder, which will probably have to have another categorizing pass made on them1. In some cases, it was a matter of realizing I was no longer (if in fact I ever was) in a position to affect some issues, and that, even though they may have been personally irritating, I had to pick my battles. I think I’m still going to have to figure out a better way to track conversations to ensure that things get handled correctly.
Among the pending things to do is to respond to some questions posed by David Weiss, and so I think I will do that now and have one less thing on my list…
P.S., Mabry has just turned three months and has been vocalizing at me as I’ve been writing this. Maybe now is a good time to stop and play with her a bit. ☺
1During this purge, I think I stumbled onto what must be O(n²)-or-worse algorithms in Entourage, as deleting large swaths of mail when visible in the UI would sometimes take ages.
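As a back-of-the-envelope check on that footnote, you can count the element moves a delete-from-the-front loop performs (a toy model, not Entourage's actual data structures):

```python
def deletion_cost(n):
    """Element moves needed to delete n items from the front of an array, one at a time."""
    moves = 0
    remaining = n
    while remaining:
        moves += remaining - 1    # removing the head shifts every later element down
        remaining -= 1
    return moves                  # n*(n-1)/2, i.e. quadratic in n

print(deletion_cost(8000))        # -> 31996000 moves for 8,000 messages
```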
Here I am, a guy who hasn’t been without a personal Mac since ‘90 (and had regular access to them as early as ‘87), a proponent of Mac software at Microsoft since joining in ‘95, and through my own development efforts, making the Microsoft Office experience better on the Mac, and my wife is choosing Windows because of limitations of our Mac software1.
Admittedly, Miller’s particular requirement is one that is not shared by a large percentage of the users or potential users of Mac Office2–she needs to collaborate with her students, fellow TAs, and professors on electronic documents written in Arabic. This is by no means a new requirement for her. She began her Arabic study four years back as a requirement for her masters degree in Comparative Religion, and decided to parlay her studies into a doctoral program in Arabic Studies.
The problem? Almost all work is done in Microsoft Word documents, and Microsoft Word for the Macintosh does not support any right-to-left languages (aka, “bi-di”). If she works in Mac Word in Arabic, she gets completely disconnected characters3. She has unseated me many, many times at my Windows box in our office so that she can use Windows Word and actually get work done.
Having worked in the MacBU as a fellow developer, I had a great “in” to try and get this addressed. Unfortunately, every time I would bring it up, we’d do the back-of-the-envelope calculation of development cost4 versus the number of users who would use it (upgrade incentive for current users, and new users buying in due to the feature). The internal statistics for Mac adoption in Arabic-using (and, to a lesser extent, Hebrew-using) nations did not make for a pretty picture for the “leverage” this feature would provide–our development dollars would probably be better spent on other features that had a higher impact * user-base value.
Of course, I’ve made alternate suggestions based on the software she does have on her Mac. In 10.4, at least, Pages had issues loading/working on Word documents with Arabic. (I know not whether creating documents from scratch works better; I suspect so, since even SimpleText, er, TextEdit does a fair job, AFAICT.)
The next question is, of course, why use Word documents then if another doc format has better Mac OS X (and theoretically cross-platform) support for Arabic? Answer: Other document types don’t have ubiquitous, well-known editors. In Miller’s specific case, all U.W. students have access to Word for both platforms, and the labs have more than enough Windows boxes. Most Arabic students/professors don’t have this problem because they don’t use Macs. (Nor is this really an incentive to do so.) They do not have any reason to switch document formats for the (apparently-)minority class of Mac users.
I suppose, though, the issue isn’t completely closed. It’s possible someone will comment that we need to try some specific software to solve the problem. OTOH, it may still be that Miller ends up with a Macintosh computer for her next machine, but if so, it’ll probably be BootCamped to some form of Windows.
Updated: I wanted to show some examples, but not knowing Arabic, I needed some assistance. So, the Arabic word romanized as “mumkin” (meaning “possible”), looks like in TextEdit, but like in Word. Thanks to Miller for helping me out with these.
1Arguing protectionism here would be a little silly. Windows Word gets a fair amount of this functionality from Windows, so if Microsoft Office was a 3rd party, the Word team would have been in a similar pickle on Mac OS X, at least until relatively recently.
2And if you have hard data otherwise, please, please, please let me know so I can convince the Powers-That-Be that there’s a business case to addressing the problem.
3Arabic writing is like cursive in that characters look different if they begin a word, are in the middle of a word, or are at the end of the word. However, unlike cursive, they are seriously different, and the word becomes very hard to recognize, not to mention the layout problems that are caused (because they no longer are the same width).
4The cost has changed significantly over the course of years. Seven years ago, our best bet would have been to port the entire Windows support for ligatures (which Windows Word uses without having to implement itself), and bi-di support from Word. That would have been quite expensive. Nowadays, despite not being able to just replace the Word layout engine with ATSUI (so that we could continue to guarantee identical layout across versions/OSes), we could theoretically offload some of the work to ATSUI and then ask it what work it did and translate that into our own layout world. This is obviously less expensive, but still involved and prone to being a bug farm.
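Incidentally, the contextual shaping described in footnote 3 is visible in Unicode itself: the Presentation Forms-B block encodes the positional variants of each Arabic letter as separate code points. A quick Python illustration using the letter BEH:

```python
import unicodedata

# The base letter BEH is U+0628; Unicode's Arabic Presentation Forms-B
# block also encodes its four positional shapes as separate code points.
for ch in "\uFE8F\uFE90\uFE91\uFE92":
    print(ch, unicodedata.name(ch))
# -> ARABIC LETTER BEH ISOLATED FORM / FINAL FORM / INITIAL FORM / MEDIAL FORM
```

Real text uses the base code points and relies on the layout engine to pick the shapes, which is exactly the work Mac Word wasn't doing.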
Monday, July 30, 2007
Miller and I (officially) became parents over the weekend. Mabry Parker Herring was born Saturday (7/28) at 6:28pm PDT. She was 6 lbs., 10 oz., and 20” long. She and her mother are both sleeping while I’m taking night duty, and stealing a little computer time. There’s a photoset up on Flickr.
Photo provided by Rob
Saturday, July 07, 2007
Another number 2, and I’m back at home, and somewhere I get the desire to go jogging. (Where did THAT come from?) Made it all of five blocks before switching back to walking and headed home. While our housemates were hosting their birthing class reunion, I finally got the hammock my parents had given me many years ago for my birthday (when I lived in a condo with nowhere to put it) installed in our back yard. On my way back in, I pilfered a little brie from the remnants of the reunion treats, and now I’m off to take Michael’s folding bike, which Miller borrowed on my behalf, to 2020 Cycle to get patched, and I’m thinking a little iced americano from Katy’s will go well with the remaining taste of brie. Mmm… good day so far.
Monday, July 02, 2007
First, an annoying detail I hadn’t noticed about the drive enclosure–it requires two USB connections: one for power, and one for data (and probably more power). This makes it a little more unwieldy, and furthermore, since I often use a USB-based laptop mouse, problematic. However, I generally live without external storage, so I’m not too put off.
The actual hardware transferring process was actually pretty much a breeze. I’d dug into my TiBook before, but the MBP is new, and so it was nifty to get in and see how it was put together. I have to say that the side, back and sunken bottom screws seem to be a much better system for maintaining a secure case. Kudos to those Apple hardware guys. My only two problems here were that I didn’t know exactly how much force it would take to detach the case top from the clips (and was probably overly gentle and thus not getting anywhere for several minutes), and that I was missing one of the rubber grommets that hold the hard drive in place. (In the latter case, it is entirely possible that it jumped ship as I removed the old drive from the case, but a search of my workspace did not reveal it.)
The rest of the process has been a bit of a drag though. The first concern was converting from a BootCamp-enabled setup (a BootCamp-created partition and then everything else on one HFS+ partition) over to a different layout more in line with my historical preferences: at least two HFS+ boot partitions (>= 20GB), one to host the current released OS X and one for a pre-release OS X, plus one large data partition to host data. I ended up taking that template and adding a small case-sensitive partition for some testing purposes, and room enough to maintain the previous BootCamp-created partition. After some small amount of figuring, I tell Disk Utility to do its magic, and I’m ready to start transferring.
Even though I only had the one big HFS+ partition for Mac OS X things, I had still maintained some segregation left over from the TiBook days. So I moved everything in my “Stuff” directory and my source enlistments to the new data drive. This ended up dying in the middle because, despite the fact that Microsoft Entourage was shut down, reminders were still turned on, which meant that the background Database Daemon kept a lock on my identity’s database. Not quite thinking correctly, I re-dragged the Microsoft User Data folder over to the new partition. See, it had succeeded in moving some items, but not all items, and as such, I blew away other parts of my identity, including my rules1. I had been moving rather than copying because I would need to trim down the original drive; it wouldn’t fit on a 25GB partition. Drat. Well, the rest copied successfully, and then I was able to Carbon Copy Clone the original OS disk over to the new, smaller partition. This too was problematic, in comparison to older times–having the two boot partitions means that you can boot off of one to transfer the other, but I ended up having to use target disk mode and my mostly-unused G5 in my office to accomplish the same thing. Once there, I made happy symlinks in my home directory to the Documents, Movies, Pictures (etc.) folders on the data partition, and the Mac-side was good to go.
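For what it's worth, that symlink step is easy to script. A sketch, with an invented /Volumes/Data mount point standing in for the real data partition:

```python
import os

def link_home_folders(home, data, names=("Documents", "Movies", "Pictures")):
    """Point selected home-folder names at their counterparts on a data partition."""
    for name in names:
        link = os.path.join(home, name)
        target = os.path.join(data, name)
        if os.path.islink(link):      # already linked on a previous run
            continue
        os.symlink(target, link)      # assumes the original folder was moved away

# Example (paths hypothetical; substitute your data partition's mount point):
# link_home_folders(os.path.expanduser("~"), "/Volumes/Data")
```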
This is where I started running afoul of missing support for moving BootCamp partitions around. Disk Utility wouldn’t touch the thing2. I downloaded WinClone, but ran into this problem. At this point, I gave up on Windows for the moment, knowing I could always pull it off the external drive later.
Now it was onto installing Leopard in the space lovingly devoted to it. I rebooted into the installer DVD and it complained it couldn’t install on a drive formatted in this fashion. “Argh!” A quick inspection in Disk Utility showed that it was, in fact, partitioned using Apple Partition Table rather than GUID Partition Table. Then, of course, I slapped my head, since I remember having noticed that when I looked at the completely clean new drive, but apparently when I repartitioned it, I neglected to switch which partition scheme I used.
After much grumbling, I used Disk Utility to image the two data-containing partitions back onto the external drive, re-repartitioned using the right partition scheme this time, and spent quality time trying to get the images back onto the drives. In the case of the OS partition, for some reason, both Carbon Copy Cloner and Disk Utility balked many times in a row at the restore, and the first restore that took had for some reason decided to ignore the permissions on the disk image, resulting in a non-functional OS. (It would drop into some boot-time Unix console, try several times to load some services, and ultimately fail.) This morning, I am finally back to the point where I can install Leopard.
I know that this is not a typical procedure for non-technical people, but I really do wish there were a tool, possibly part of the Migration tool, or maybe Disk Utility (or, obviously, a third-party solution), that let you batch move/resize partitions so that I could do this in one stop. It would be especially cool if it allowed me to split an existing disk into two separate disks by choosing which sets of the file system I wanted transferred to which sides.
1I think the rules regeneration count is now up to eight. At least this time, it was my fault, and not some obscure un-diagnosable rules corruption.
2On the other hand, the Startup Disk would still recognize the drive as a bootable Windows partition, though if you selected it and tried to boot it now that it was a USB-based drive, it would bring up the flashing I’m-missing-a-system icon at boot.
Thursday, June 28, 2007
Yee pictures show off graphically how various election mechanisms–e.g., plurality and instant runoff voting, as well as a collection of other esoteric methods–actually respond to the will of the populace. It’s interesting to see what happens in the case of ties and when the vote is split for several mechanisms (see the result for plurality above). It’s a good reminder about how problematic plurality is, but it’s also a good wake-up call for IRV supporters who think that ties won’t be a problem.
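The vote-splitting effect is simple to reproduce in a toy one-dimensional model (the candidate positions and voter counts below are invented for illustration): two similar candidates divide a majority, and the candidate most of the electorate likes least wins under plurality.

```python
from collections import Counter

# Voters pick the nearest candidate on a one-dimensional opinion axis.
candidates = {"A1": 0.40, "A2": 0.60, "B": 0.95}
voters = [0.35] * 32 + [0.55] * 33 + [0.90] * 35   # 65 of 100 sit on the A side

tally = Counter(
    min(candidates, key=lambda c: abs(candidates[c] - v)) for v in voters
)
print(tally.most_common())   # -> [('B', 35), ('A2', 33), ('A1', 32)]
```

B wins with 35 votes even though 65 of the 100 voters prefer either A candidate to B, which is exactly the pathology the plurality panels in the Yee pictures display.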
Tuesday, June 12, 2007
Welcome to a new age of .NET. With the Community Technology Preview (CTP) of Silverlight™ 1.1, the first commercial cross-platform Microsoft® .NET™ components became available. Furthermore, my tongue is no longer officially tied. The cat is out of the bag, and thus I can now clue you all in on what I’ve been doing in the last year.
Since there’s already been quite a bit of posting on the subject of Silverlight, I figured I’d fill in some of the details about its composition and some of the nomenclature, since, even within Microsoft, it’s not completely obvious what’s in the package at first glance.
- WPF/E (code name: Jolt1): Windows Presentation Foundation / Everywhere is the browser plug-in variant of its bigger brother in Windows. It is a XAML-based presentation engine.
- Windows Media codecs: These allow for high-speed video streaming inside a Silverlight solution.
- CoreCLR (code name: Telesto2): The core execution engine and platform adaptation components of .NET. These are the only parts of the managed support that are native; the rest are all managed libraries. Officially, the CoreCLR project is also responsible for the Base Class Libraries (BCL), which have been reduced for both size and portability reasons.
- UIFX: The standard framework libraries, similarly reduced from the Desktop for both size and portability reasons.
- .NET Language Runtimes: Managed libraries to support C# and VB.NET.
- DLR: Dynamic Language Runtime support libraries.
The “level-1” components of Silverlight exist entirely to provide a managed programming model for WPF/E, as an alternative to its built-in JScript model. The part of Silverlight I’ve been working on is the CoreCLR component, focusing on Mac OS X integration, but I have been banding together with my Mac counterparts on the other related teams (mostly WPF/E, with a little bit of UIFX) to make sure the collective Silverlight story on the Mac is a good one. If you have feedback for us, head on over to the Silverlight.net forums, specifically the Programming with .NET forum, and let us know. It may take more than one of us to give a full answer (for example, I know much more about how we handle native exceptions on the Mac than about what XAML you’d need to use for a particular operation), but we’ll help if we can.
1In the era before marketing took over code-name generation, we could have tributes to fine beverage products. However, since code names regularly leak out of Microsoft, and out of worry that we will get stuck with a trademark lawsuit for something that isn’t the name we’re ultimately going to use, marketing now comes up with names, presumably with some kind of trademark search ahead of time to verify that we’re not going to run afoul of someone else’s.
2No, “Silverlight” isn’t the “the gleam of yonder (Telesto) moon,” but I figured it sounded good.
Monday, June 11, 2007
Plz Use namespace System

I don’t know if I have the heart to tell him that his syntax doesn’t appear to match the only known LOLCode.net implementation. Further alas that it’s GPL’d, which means that per Microsoft’s Chinese-wall-like policy, if I desire to keep my job, I can’t even look at the implementation. Ah, well. I guess, IM NOT IN YR CODZ.
Thursday, May 24, 2007
After code reviewing some large number of files, I took a slight detour over to BoingBoing to an article on an alternative CAPTCHA mechanism. Instead of using a normal CAPTCHA system, typing in known words that have been graphically altered so that a computer couldn’t interpret them, you type in unknown words gathered via OCR that no computer has interpreted, but that the reCAPTCHA system has a pretty high likelihood of knowing what at least one of them is. You type in two words. You get the “known” one right, and you get in, and your submission for the other word is kept to help determine what the other word is. It’s a pretty interesting idea. Plus, they’re using it to help digitize documents in the public domain that have been scanned, but that OCR has been hard-pressed to convert–a good cause.
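The scheme can be sketched in a few lines (a toy model of the idea as described, not reCAPTCHA's actual protocol; the words and ids below are invented):

```python
guesses = {}   # unknown-word image id -> human guesses collected so far

def check(known_answer, known_guess, unknown_id, unknown_guess):
    """Grade only the known word; record the guess for the unknown one as evidence."""
    if known_guess.strip().lower() != known_answer.lower():
        return False                         # failed the word we can verify
    guesses.setdefault(unknown_id, []).append(unknown_guess)
    return True                              # human passed; guess kept for later voting

# The "1988 for 1980" observation below: a wrong guess on the unknown word still passes.
print(check("morning", "morning", "scan-042", "1988"))   # -> True
print(guesses["scan-042"])                               # -> ['1988']
```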
In my playing around with the example on the site, I noticed some oddities:
- You can fudge at least one of the words. There was an instance of “1980” and another word. I guessed that the other word was the “known” word, typed that in verbatim, and passed in “1988” instead. The reCAPTCHA system said I was in like Flynn.
- You are likely to get partial or multiple words instead. Since it’s OCR that’s already known to not interpret it correctly, it may not even get the correct word break and when you are instructed to enter “two words,” you may have to enter in three or four words, or partial words.
- Sometimes there are odd symbols in the older texts. There was a word that had an ‘æ’ in it. I dutifully asked the International menu to show me my Keyboard Viewer (since I never remember the Option-key combination for those diaereses), found the character, and submitted the word exactly as written. It called me correct, but I wager that normal people won’t bother to use those high-falutin’, old-fangled characters, and just type “ae”. It might be close enough, but is it right?
- Use of alternative symbols might get you an error. There was a word that clearly had a right-side closing single quote character used as a possessive or a contraction. However, I dutifully entered in the curly-quote version and was rebuffed! I guess straight quotes are what they wanted.
- Older texts have spelling mistakes or perhaps older or alternative spellings. I wonder how many people are going to bother making sure they type that extra ‘t’ that doesn’t belong; will it be enough for the verification code to preserve the original text’s spelling?
- Some of those OCR documents are hard to read as it is, and blurring it makes it harder. There were several times where I had to refresh the reCAPTCHA many times in a row because it was just impossible to read. I wonder what their mechanism is when many, many people have avoided making a claim, or made irreconcilable claims as to what that word is. Does some poor shmuck have to go back and look at it manually? Or do they start limiting the blurring? Hmm…
All in all, though, it still seemed like it provided a very reasonable security while providing a nifty public service.
Tuesday, February 06, 2007
When I first saw these, I felt comforted. I was used to seeing the same on my Mac OS X boxes when an application was requesting admin or root privileges. However, the sheer number of times that I needed to actually acquire administrative access to my machine surprised me. I would often think, “Does this thing really need administrative privileges to work?” In a lot of cases, the answer was, “Yes,” but in some, it seems that some more work could be done to either allow an application to install/run with user privileges (e.g., browser plugins) or to isolate preference options according to whether they need the dreaded “elevation”2. Ultimately, though, that work can be done, and application developers and IT folks have some Microsoft-provided tools that can help them make their applications/plugins/whatever behave well in this environment.
Later, I would increase my Good Dogfooder™ karma (or reduce it, depending on whether you subscribe to the only-negative-karma idea) by enrolling in a pilot program to try out redirecting user folders to a network share. It pretty much works as advertised. Everything that would be in C:\Users\nathanh.000 gets redirected to a network volume that my domain account has read/write access to. It takes a bit of time to log in, but no matter where I log in, I have the same information. Similarly, there are a few applications and products that don’t handle this gracefully... applications that like to install shortcuts on the Desktop, for instance, are annoying unless they’re installed somewhere on my user directory (which is rarely, if ever) or installed in the same location on every machine I log in to. Again, I can report back my woes and, at least with Microsoft-written applications, have a shot at having them work better in that environment.
Alas, as is usually the case with living on the bleeding edge, there’s the blood. Dogfooding at Microsoft is de rigueur, and there will always be some annoyances, though the product teams do a lot to keep them to a minimum. My problem is the confluence of elevation in the presence of the network home folder.
Let’s review: My domain account is my main account, and is a standard user, not an administrator. My administrative account is a local machine account, and has no domain privileges at all. My home directory is a mounted network drive where my domain account has privileges, but not my local machine administrative account. Do you see what’s coming?
Situation #1: Installing from a network share.
I can browse to some UNC path from my domain account, and click on an installer. The installer, UAC-aware as it is, puts up the elevation dialog, since it has to install into some machine-wide areas. I enter in my local administrator account and password. Voilà, the installer now runs as if it is the local administrator account, and instantly no longer has access to the network share, since my local administrator account doesn’t have privileges. Fortunately, this is usually only an annoyance, as most of the time, I am presented with a dialog complaining that the installer doesn’t have access to the share, and could I please enter in credentials of someone who does. I find it odd that I have to enter in my own credentials when I’m already logged in as myself.
Situation #2: Installing into the home directory
In the case where an installer needs to install into some machine-wide area as well as the home directory, I’m more stuck. I have to elevate to run the installer, but as soon as I do that, the installer loses access to the home directory. For some reason, in that case, it doesn’t present the network-access dialog, and things just fail to work.
In one such case, a program installed most of its items, but then failed to create a startup shortcut for me. “No matter,” I thought, “I’ll just copy it over myself.” I noticed that the shortcut I needed was already in my Start menu, so I browsed to it and copied it. I then browsed to the All Users startup items folder and tried to paste it as a shortcut. I then get the following dialog:
Windows cannot create a shortcut here. Do you want a shortcut to be placed on the Desktop instead? [Yes] [No]
“Hunh,” I think, and choose, “Yes.” Personally, I thought it should have just brought up an elevation dialog and let me paste the shortcut, but no matter. There is more than one way to do it. So, I then try to drag the shortcut nicely pasted onto the Desktop to the All Users Startup Items folder. Happily, I get the elevation notice. I think, “Great! Now I’ll have access to put this item here.” But I had not thought it through, for while I gained access to put the shortcut there, I simultaneously lost access to its source. Thus, after another elevation dialog (I don’t think that one should be necessary), I get the following dialog:
You don't have permission to copy files to this location over the network. You can copy files to the Documents folder and then move them to this location.
I take issue with this dialog on two counts. (1) I do have permission to copy files to this location; I just got it. What I lost was permission to copy files from the source location, so that “to” is misleading. (2) Copying files to the Documents folder is not going to help me, since my Documents folder is on the same network share as my Desktop.
The sad part is I could have elevated, and copied the shortcut directly, without ever having to place a copy on my network home directory. But I wasn’t thinking through the ramifications.
Unfortunately, neither will our users. Furthermore, they’re not going to understand why these things don’t work “right” or “just work.”
I am not part of the UAC team and have definitely not considered all the ramifications of the implementation of extreme credential isolation. Still, I am certainly enjoying the authorization mechanism on Mac OS X, which allows me, as my current user (with all the access that implies), to perform specific tasks requiring a higher authorization level by supplying administrative credentials, while still having access to the secured resources my user should have access to.
I expect the UAC edges will get smoothed out, as will network home folder redirection... I mean, hey, why else do we do this dogfooding anyway?
1Actually, it was at that point that all Microsoft employees were exhorted to use Vista as our desktop OS–even though no official effort had been made to ensure that our group’s build environment and tools would still work–so that we would make sure the experience would be good for our customers. Our woes were noted, and some were addressed in time for Vista and the associated tools to be released to manufacturing, so our pain was not in vain.
2Originally, I thought “elevation” was just a synonym for authorization. Alas, it is not. When you “elevate,” you (or rather your process) becomes owned by another user with more privileges than you. That user may also have fewer privileges than you in other realms. This can be a problem.
Wednesday, January 10, 2007
I would expect there’s a high incidence of collectors among people who write code, largely because I believe that both are ventures that jibe well with people who are “detail-oriented,” which is my polite way of saying obsessive-compulsive. In my case, I’ve gone through several iterations of collecting, starting with an aborted attempt at stamp collecting back when I was maybe 10. To some extent, it seemed to me that collecting was just a specific form of being a careful pack rat: you take care of your belongings to help them last, even if you don’t need them at the moment, just because someday you might, a last vestige of handed-down Depression-Era values.
I switched over to collecting coins, but not expensive ones, just some interesting U.S. coins I had come across. One of my favorite finds was a glass jar of pennies that my father had saved when he was a kid. I was so excited to find some “seriously” older coins (e.g., from the ‘50s rather than the ‘70s or ‘80s).
I have since taken to dumping my pocket change into jars: one empty Glenfiddich metal canister for the quarters, aka laundry money, and one empty plastic jug that formerly contained olive oil for all the rest. Maybe someday my kids will find them as exciting, if these containers survive that long. The first attempt at this involved storage in an empty poster tube; during the Nisqually quake, the tube herniated and broke, and my wife (then girlfriend) decided to “clean up the mess” by taking all the coins to Coinstar! Coinstar of all things! Not only did I not even get to see six years’ worth of savings churn through the machine, but I got to pay 8.9% for the privilege! AIEE!
Ultimately, I didn’t do much in the way of collecting coins as a kid. I just kept a few dollar coins and some examples from various years. Another short-lived collection was that of Star Wars trading cards (even as a kid I eschewed sports-related things); I don’t even think that lasted through a full summer before I traded them away for something else.
In college, though, I started collecting for real. I fell victim to this newfangled idea: a card game where you collect the cards. This was, of course, Magic: The Gathering™, around the Alpha/Beta transition, and before I had even heard the CCG acronym. My college roommate, Naval, would purchase boxes of cards, sell the packs, and then trade with people as they opened their packs. Well before Kyle MacDonald hit the trading scene, Naval turned one Island into a Shivan Dragon. I did my own purchasing and trading. Even now, I have a Fleer binder of nifty cards and a white box of extra cards in my office, for lack of a better place for them. My friend Brian, who has alternative names for most everything, refers to Magic as cardboard crack. I’m pretty sure I was addicted, or close to it, while in college; any extra money I had from my computing jobs on campus would go into it. Later, I’d try out Jyhad and Middle-earth, and to a lesser extent Dr. Who, Star Wars, Star Trek, and BloodWars, before I ultimately swore off investing in any new CCG.
After college and after collectible card games, I found my way back to coins via what I would have considered an unlikely source for me to buy anything from: Shop At Home TV. I usually just tell my TV to omit those shopping channels when setting it up for the first time, so I’m not sure exactly how I came to be staying up late at night watching the Coin Vault. Somehow I got drawn into buying a whole slew of silver coins (mostly brilliant uncirculated), and only later discovered that there was a hefty markup. Nonetheless, it rekindled my interest, if only mildly, in numismatics. I certainly enjoy getting each year’s silver eagles to deposit into my Dansco album.
In addition, my predilection for Zeppelin1 paraphernalia, which is a collection2 in its own right, bled into coins with regard to the 1930 three- and five-reichsmark coins, which feature the dirigible and laud its 21-day circumnavigation of the world in 1929. I have to say, it’s pretty difficult to find those coins on this side of the water. With some rusty college German, I can partially navigate some internet sites and eBay sales. Ron, of germancoins.com, has been a good resource, both in terms of coins and in how to bridge the gap between US gradings and German gradings. One of my early morning tasks today is to perform my first international wire transfer, to pay for a recent winning auction bid on such a coin.
It still seems a little crazy to be trying to guess whether or not an online vendor is gouging you (and by how much) without a price reference guide, or even without many vendors selling the same items. It makes for some hefty search sessions (my browser tabs multiply to the point where you can only see the first three letters of each page title), and even then, it still sort of feels like jumping off the deep end, especially in the realm of coins, where you’re trying to double-check the quality grading by inspecting a grainy submitted picture that isn’t particularly zoomed in. Nonetheless, I’m trying it out, and we’ll see how well it all turns out.
1 The airships, not the band.
2 I had been a fan of airships for as long as I can remember. But it was only while collecting some material for a game of Nobilis (entitled Means and Ends; it had a chancel that was a modified version of Seattle Center circa 1962, during the World’s Fair) that I branched out from Life magazine copies into Zeppelin books. Of course, the geekery doesn’t stop there; I’m also a member of the Lighter-Than-Air Society, and I regularly wear a Save Hangar One t-shirt in the hopes they don’t dismantle Moffett Field’s historic hangar.
Tuesday, January 09, 2007
This isn’t to say that the SP3i wasn’t nifty at one time. When I was willing to spring for the extra $20 a month it cost for insanely slow wireless internet service, it synchronized nicely with my Exchange account at work, which was pretty much the only way I ever used its connection. Trying to do anything other than load an extremely simple page (usually one written to be displayed on a Palm) (1) took forever and (2) required scrolling all over tarnation with the dinky joystick of intermittent operation. Said joystick came out of the phone multiple times and could never quite be put back in correctly, so right now I’m operating with the plastic stub instead. Still, being able to know I had a meeting to get to, look up where it was, or find someone’s number in my 400+ contact address book made a huge difference.
On a parallel note, the implementation of as-needed synchronization in Windows Mobile™ 2003 Second Edition’s Pocket Outlook doesn’t depend on the ultra-slow network I paid $20 a month for at all. Instead, the Exchange server sends invisible SMS messages (ones that don’t show up in your text message section) that get interpreted by Pocket Outlook to let it know there’s something new to synchronize. I didn’t have a text plan when I first got my smartphone, and after two months of $150 charges for text messages I had no idea were coming, I finally switched to syncing just every 30 minutes. My comrades in Windows Mobile ultimately changed this behavior in a later edition of Windows Mobile™, which is great, but unfortunately, they weren’t willing to upgrade my phone (they were OK with upgrading the SP3, but the SP3i was just too something (new? freaky?)). And as far as I can tell, the manufacturer won’t upgrade it, presumably because it was already an older model.
As time went by, I gave up on the Internet connection and tried to settle for desktop synchronization. Unfortunately, it was about this time that I played Good Microsoft Employee™ and dogfooded beta candidates of Vista, for which there was no ActiveSync. Now that Vista has been released to manufacturing, there’s an ActiveSync, but it doesn’t sync contacts or events anymore. “The synch status is on the device,” only it’s not. Add to all of this the device no longer responding to right button clicks, and taking on the order of 5 to 20 seconds to respond to left button clicks when it hasn’t been used in a while (is it paging things in?), and I’m starting to take a hard look at new cell phones.
One of the most annoying things about bar phones, IMNSHO, is the requirement of locking them. Yes, I know it’s not a strict requirement, but if I put the thing in my pocket unlocked, it auto-dials someone in my contact list. I’m still waiting for an accidental call to 911. Strange things happen when it is locked, though. If I get a reminder, sometimes I cannot dismiss it, because the phone wants me to unlock it first, but the reminder is on top, so I don’t get the unlock UI. There might be some kind of lock timer I could use, but that doesn’t preclude pocket-dialing people, and I don’t always remember to lock it. I was thinking I would have to go to some kind of flip or clamshell phone to avoid that nuisance. I spent some time over the holidays looking for a relatively cheap phone (since I’d have to buy it as an “upgrade” through T-Mobile or get a compatible unlocked phone, as I already have a contract), but hadn’t come up with anything.
And then Steve Jobs shows off the new iPhone.
Something tells me the Exchange support might not be all that great. As it stands, Mail.app has never once been able to synch with the Microsoft Corporate Exchange servers. (In the past, that’s meant that I’ve been even more tied to Entourage and/or Outlook, and had a vested interest in fixing/reporting bugs in those products.) I somehow doubt that the iPhone is going to do better than Mail.app. But still, the idea is alluring. The iPod I got as a ship gift for shipping Microsoft Office 2004 ended up going to my wife, who lives for her podcasts, and whose old iPod had ended up with battery issues. (NB: She recently got it re-batteried, and now I have an ancient iPod that I really should load up with some tunes.) Having a newer-model iPod that can show video and call folks and can fit in my pocket sounds great. And maybe I can even leverage Entourage’s Sync Services integration to get my info onto the phone…
I guess I’ll keep an eye on it, and if/when it comes out, figure out whether it’ll be worth switching to Cingular or whether it could be purchased unlocked for use with T-Mobile. Until then, I’ll live with my ailing smartphone.
P.S. What would be really cool would be if it had a slot not just for the SIM card that enables the device, but for extra SIM cards to serve as generic smart cards. Then you could use it as your physical token for authentication on machines. Can you imagine walking up to your machine, having it auto-negotiate credential sharing, and having a keychain show up on your box?
Thursday, January 04, 2007
Digging out the old 601 manual and peering through its pages, I realized that I missed dabbling in assembly. I say dabbling because I know just enough to be dangerous, and not enough to wring out the performance you’d want from, say, an optimizing compiler. Fortunately, all I had to do was play with stack frames and argument placement. Mmm… Nothing like screwing that up and getting a completely broken stack trace at the point of a problem. Well, anyhow, as it turns out, while it’s possible to use my shiny new MacBook Pro to do this work, it’s quite difficult to debug in Rosetta, and Rosetta seems not to be too happy about programs generating their own code (it quickly runs out of memory—perhaps it doesn’t know where the data ends and the generated code begins?). Thus, my old PowerBook was going to have its one last hurrah over Christmas, so I could work during downtime while visiting family.
I greatly overrated the downtime, which ultimately didn’t matter, as for some reason gdb on my PowerBook wouldn’t load the symbols of the code I was debugging, making things even more difficult than they would have been back at work. I gave up on working over Christmas and focused the time on my and Miller’s families.
As an aside, it seems that around my birthday, Miller and I “spawned a new process”. I don’t think it’s Intel inside there, but according to the doctor, it’s set to release in late July ‘07. Both of our families are quite pleased.
Now that the last vestige of a reason to keep the old workhorse is gone, I finally used the photos I snapped of it several weeks ago and put it up on eBay.
<shameless_plug>If you’re looking for an old MacBU warhorse, replete with a Moby sticker, bid away.</shameless_plug>