
Ubuntu 11.04: Small issues, big win

Jack Wallen recants his original take on Ubuntu 11.04 and Ubuntu Unity. See what happened to make him so drastically change his mind about the upcoming release from Canonical.

For the last few months, on nearly every site I blog for, I have been saying that Ubuntu 11.04 was going to be a big setback for Ubuntu. That "setback" was mostly due to Canonical's decision to use Unity as the default desktop, a decision that sidestepped GNOME and GNOME 3 altogether. Well, after using Ubuntu 11.04 beta 1 for a few weeks now, I have to say I was wrong. Although there are a few weak spots in the release, this beta has gone a long way toward showing me that Ubuntu hasn't fallen off the tracks, jumped the shark, or lost its way. In fact, as far as I can see, Ubuntu 11.04 will remain the king of Linux for new users.

Ubuntu Unity

Figure A

After all this time spouting off about how Mark Shuttleworth had made a huge mistake in switching to Unity, I am happy to say the mistake was really mine. But how? Why? Huh? Ubuntu Unity is a big leap away from the traditional desktop (see Figure A, left). My biggest concern was that Unity would not allow the user to configure the desktop in the fine, detailed way that Linux users are accustomed to. There was a flaw in my logic. All along, I assumed every user spent as much time configuring their desktop as I do. Wrong! Most users stick with the standard desktop they are given and maybe change the wallpaper...if that.

Once I made that leap of logic, I decided to just install beta 1 and use it. No more worrying about how much I could tweak the desktop or make it do exactly what I wanted. This time around it was all about just using it, just experiencing it.

With that intent in mind, the Unity desktop just disappeared. I was shocked to find that the intent behind both GNOME 3 and Unity actually works wonders. The more the desktop "disappeared," the more efficient my work became. I was no longer focused on how the desktop was working, but on how I was working on the desktop.

That doesn't mean it's perfect. Not yet. There are small bits and pieces that Unity still needs to put in place. One such piece is the "Connect to Server" tool that was so much a part of GNOME. Without this wizard, a user will have to rely on shared network directories (thank you, Samba) or will need to know to open Nautilus and enter smb://ip_address/share in the location bar instead of just opening the Connect to Server wizard.
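
If you're comfortable with a terminal, the share can also be reached by hand. A rough sketch (the server address, share name, and user name below are made up, so substitute your own):

gvfs-mount smb://192.168.1.10/shared                                          # per-user mount through the GNOME virtual filesystem
sudo mkdir -p /mnt/shared                                                     # create a mount point
sudo mount -t cifs //192.168.1.10/shared /mnt/shared -o username=youruser     # traditional system-wide mount (requires cifs-utils)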

Another issue is the OS X-like menu system. Now, when a window is open, the menu for the application is not contained within the application window but sits in the panel at the top of the screen. Although far from a deal-breaker, this will take some getting used to. I really noticed it when using The GIMP. In fact, The GIMP takes this one step further: when the application is opened, the menus seem to be nowhere in sight. It wasn't until I right-clicked The GIMP's main window that the menu finally made itself known. A bit of an inconsistency, but again -- not a deal-breaker.

Surprising hits

There are a lot of nice little touches built into Ubuntu 11.04. One example is the Applications search window. This search is tied not only to the installed applications but also to the Ubuntu Software Center. So if a search turns up an application that isn't installed, clicking the result opens the Ubuntu Software Center so the application can be installed.

Figure B

Ubuntu One also benefits from a new control panel (see Figure B). There is no more second-guessing whether a system is associated with your Ubuntu One account, or whether the system is syncing with the account. In fact, the whole of the Ubuntu One interface has been retooled, even to include that one word I detest so much - CLOUD! But I'll let it slide this one time. And finally, Ubuntu One is better integrated into the notification system of Unity.

Final thoughts

I was taken by surprise at just how much I liked Ubuntu 11.04. Even with a few little glitches and changes that seemed counter-intuitive, the Unity desktop is amazingly transparent, so the focus becomes the work, not the desktop. And, to little surprise, the underlying system is one of the fastest and most reliable I have used to date. Ubuntu is putting out a beta release that puts many other final releases to shame, so you have to at least give Natty Narwhal a try. Even for skeptics like me, Ubuntu Unity has come a long, long way from its last release to now. It works. I'm sold. I bet you will be as well.

About

Jack Wallen is an award-winning writer for TechRepublic and Linux.com. He’s an avid promoter of open source and the voice of The Android Expert. For more news about Jack Wallen, visit his website getjackd.net.

112 comments
bthegeek

A lot of people say that the Unity interface is not as good as Windows 7's taskbar, but then there are features like the workspace switcher, Ubuntu One and many others that are not in Win 7. I think the best way of switching to 11.04 is to stop comparing it with others (especially Win 7) and just use it.

DHCDBD

I think I will skip 11.04 for a while. I have read a few negative reports. As for Ubuntu in general: I recently upgraded to 10.10 from 9.04, and Ubuntu is blowing it. Xserver crashes routinely (I may have that straightened out). The user control panel screws up about every other reboot and I have to remove it from the panel, delete and purge the applet from Synaptic, and reinstall it. I have difficulty compiling a program because the 2.6.35-29 kernel uses the 2.6.35-12 headers. Just so many problems. I need to use the computer, not fix and repair it on a daily basis.

andrew5859

I don't get it... everything after 9.04, no matter what distro it is, has managed to screw up the Alps DualPoint touchpad, also known as Synaptics. Yet Canonical, KDE, GNOME, developers and the like have consistently ignored anyone's voice on keeping these drivers in their kernel since version 9.04, whether it's Mint, Ubuntu, Kubuntu, or whatever distro. My issue is that you can't take the cursor to the edge of any given window and then stretch the window, thereby resizing it bigger. I've been told to use a wired or wireless mouse. Well, I don't want to have to use a wired or wireless mouse; that's why the laptop makers put touchpads in them, so you wouldn't have to use a mouse. Issues with Ubuntu 11.04 beta 1: you can't add anything to the top panel, like weather apps or any other small apps. You can't even right-click on it; it doesn't do anything. I realize that they're trying to improve on looks and functionality, but the thing is, you can't have both, and often you lose one for the other. I'm finding the developers are becoming a lot like our government: they make and create and decide what they think we need, without really listening to what we're actually saying or requesting, based on our needs for a better OS or newer version. Why, then, do we have forums for all these different distros when no one is really listening to the requests of those who need help? Better yet, why are these distros being put out before they're actually fit for public use, sent out with bugs and flaws? It just seems really redundant.

Jaqui

The OS X-style menu placement is a show stopper; it is the primary reason I will NEVER buy a Mac. Application menus belong in the application window, right under the TITLE BAR of that window. Never mind the second idiocy Canonical commits, the rootless sudo config. Until THEY FIX that security-flaw config, Ubuntu is the worst choice for a distro, and I will continue to consider anyone using Ubuntu brainless.

lehnerus2000

The "great" new feature of Ubuntu is disembodied menus? I hate disembodied windows (e.g. GIMP when you run it for the 1st time). I don't think that I'm going to like disembodied menus. I have Ubuntu 10.04 installed, but I'm basically trapped in the GUI, because I don't know the CLI commands and I'm not familiar with the directory structure. Still, I'll put a copy of 11.04 on a CD at the end of the month (after they actually release it). I agree with some of the earlier comments about networking. I'm currently doing "Introduction to Linux" for my "Network Administration" course. Given that Linux only has about 20 settings, it's incredibly difficult to use. At least Windows has the excuse that there are 100s of settings. For example creating a location that anyone can dump files to (or read files from) seems to be impossible in Linux. In Windows it only takes about a dozen mouse clicks.

damian205

I don't understand the obsession with OS X or with Apple products in particular. The menu structure is not particularly logical, and when working with multiple windows it simply becomes an annoyance, as moving to the top of the screen after changing windows is frankly a pain, particularly after working in Windows 7 and using the Snap feature. The frankly awful battery life of iPhones and the restrictive nature of iPads and iPods has to make me question why Mark Shuttleworth would strive to copy this system. Restricting choice and forcing an ideology is never a good thing. I realise of course that one does not have to choose Unity and can wait for a respin, but most users simply use what is in front of them, and what is in front of them is simply not that good. I have a MacBook Pro running Snow Leopard and, to be honest, it really is not all that productive. I have also run Win 7 on the same hardware and frankly much prefer it. But at least the option to right-click is back. At work we use CentOS 5 on our servers and Ubuntu 9.04 on our customer access machines with very little difficulty. Familiarity with OSes is not a problem and we recognise the value of Linux in general, but to be honest, at present Win 7 is my preferred OS. The handling of multiple windows, the vastly improved usability of the taskbar over all the other OSes, and general performance mean that, for me at least, Mark Shuttleworth should be looking to emulate that and not OS X, which is, after all, like XP, an old OS. This is not intended as a troll or an invitation to a flame war.

loweaj.1

.........about what other things you have dumped on and pissed all over, in your oh-so-many contributions on oh-so-many blogs, without actually trying them out for yourself. I think that would be a really interesting blog post. I'm no fanboy of any particular OS persuasion, so this is not that sort of rant. Like so many other tech-oriented blogs, the moment I see this sort of nattering on about things not actually tried, it is not worth my time to read. You have had really good stuff in the past, so please try and stay relevant. I would hate to see you turn into some kind of content farm.

What the ...!

What is the name of the release after zealous zebra?

Slayer_

I know the next Mint update is coming. Do you know if the in-place upgrade works if you are using Mint2Win?

norm7446

No, because I didn't have a problem with it in the first place. As for me, it once again goes to show how much diversity and talent there is out there in the open software community. My hat does go off to Mr Shuttleworth for taking a bold step in another direction, though it might yet bring the house of Ubuntu down the rankings, as it will not appeal to all.

Adan_Ova

The thing where you search for apps in the repos and install them from the menu has been in Linux Mint since last year.

ddalley

Until Ubuntu starts using a rolling OS release, I can't see myself being interested in it anymore. I hate having to format and start over (no upgrading). Thankfully, Linux Mint has taken a leading role with some of its recent releases and has moved away from Ubuntu for some desktops. Yay!

caridley

I have been using Ubuntu on some desktops, in VMs, and on server hardware for some time now, since version 8 I think. I was quite skeptical about Unity, and though I only have it installed in a VM at this point, I was pleasantly surprised and pleased with what 11.04 has done. Since version 10, I have been pleased with Ubuntu both as a workhorse and for organizing MP3s and video, video and audio streaming, and general entertainment purposes. It fills both the server room and home PC roles very well. I have primarily worked in the MS environment since DOS 5, and every iteration since, so I am well aware of the advantages and disadvantages of MS and Nix. Ubuntu, in my opinion, is what brought Nix to the masses; it is doing a very good job of it and has even managed to make it fun. 11.04 continues that course and delivers reliability and a distinct cool factor, whether as a workhorse, entertainment center or all-around general-purpose OS. Thanks for your review, and, yes, I was wrong about Unity. Misery loves company.

saquibng

I am a fan of Ubuntu and I use it along with Win XP, but a major flaw is the sound quality while using VoIP, which works perfectly with Win XP. Any help there will be appreciated, as will help with installing software.

drkeshav

This may be a bit off topic. Linux will be very popular among PC users if the installation of applications is made as simple as it is in Windows. Even people like me who are professionals find it difficult to install applications properly and easily. In India, there is great scope for Linux, provided the installation procedure is made as simple as that of Windows. Ubuntu is a great distribution which I have been using. All the best to the people behind its development. Good luck.

Slayer_

My touch-pad works perfectly, exactly like it does in Windows. Humorously, there is an update for my touch-pad drivers from HP, but I applied it once and it actually ruined the experience of my touch-pad and made everything sluggish, so I reversed the patch. I rather like how in Nix you can tell it to disable the touch-pad while you are typing.

PineappleBob

Are you serious when you say Linux has like 20 settings and Microsoft has hundreds? As in settings for the desktop or the entire OS? If the OS, you are sadly mistaken; Linux has quite a large number of settings one can tinker with if one so chooses. I will agree: disembodied menus, especially in The GIMP, are a huge annoyance, and I never liked the Mac's menu at the top changing with whichever app was in the forefront. It just does not work for me.

rmerchberger

"""but most users simply use what is in front of them and what is in front of them is simply not that good.""" That's the only thing I agree with... but probably not the way you think. That's how I feel about Win7 -- a lot of the "usability enhancements" you prefer I find counter-intuitive. I *much* prefer WinXP's taskbar over 7's, and the "snap" feature I find irritating, as quite often I want to move a window to the top of the screen without changing its size. I think Win7's inability of reverting a lot of the UI functionality back to a preferred XP method does just as much towards """Restricting choice and forcing an ideology""" as anything you'll find in the Apple camp. At least with Ubuntu, any preferred window manager is just an apt-get away. I, like Jack, did not like Unity the first time I tried it. Still don't. Admittedly, I only have about 10 minutes behind that particular wheel... I also don't like upgrading every 6 months either, so I plan on sticking with 10.04 LTS as long as possible. (That also saves me from worrying about downloading "respins" as you call them.) I don't plan on joining the "Unity 12-step program" anytime soon, but when I don't have a choice (and if it stays Ubuntu's default choice at the next LTS rollout), I may revisit the issue at that time. Until then, I guess I'll stick with Gnome as my "daily driver" and FVWM as my all-time fave...

Brian Doe

They have until Autumn of 2017 to figure that one out (17.04 will be the ZZ release)!

Adan_Ova

The name after zealous zebra should be something with an aardvark =P

Neon Samurai

Someone above mentions that Mint is going to more of a rolling distribution release, though. Someone else must have Mint2Win installs to upgrade. If it's actually an outright rolling distro, then you should be able to simply keep current through the package manager updates. In case it helps, you might also consider a dist-upgrade (try it in a VM first, of course). Debian isn't a full rolling distro, but the version upgrade works well: update your package list (aptitude update or your GUI app); upgrade your packages (aptitude full-upgrade or your GUI app); edit /etc/apt/sources.list, changing entries to the new distro; then aptitude update and aptitude dist-upgrade. It should upgrade packages from the newer version, ask any distribution upgrade questions it needs to, and otherwise do its thing. With Debian, I just search "lenny" and replace with "squeeze". I'm not sure how Mint specifies its repositories.
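
Roughly, the sequence described above, assuming a Debian-derived system and using the lenny-to-squeeze move as the example (adjust the codenames for Mint's repositories):

sudo aptitude update                                     # refresh the package lists
sudo aptitude full-upgrade                               # bring the current release fully up to date
sudo sed -i 's/lenny/squeeze/g' /etc/apt/sources.list    # point the sources at the new release
sudo aptitude update
sudo aptitude dist-upgrade                               # pull in the new release's packages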

ultimitloozer

Why do you have to start fresh with a new release? Why not just upgrade?

fairportfan

Press Alt-F2 and type "gksudo update-manager -c" (minus the quotes) in the dialog box. (This checks to see if there are new stable versions. If you also use the "-d" switch, it will also look for beta {or even alpha} versions.) Click "Run".
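
The terminal route, for what it's worth, is Ubuntu's do-release-upgrade tool, with the same meaning for the -d switch:

sudo do-release-upgrade        # upgrade to the newest stable release
sudo do-release-upgrade -d     # also consider the development (beta) release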

Brian Doe

Software installation is inherently different between Windows and Linux. Windows users are used to browsing the internet looking for applications to download, and when they find and download one, simply double-clicking it to install the application. Linux, particularly Ubuntu, uses a different approach. Ubuntu maintains a vast software repository. In most cases, if you want a particular application, all you have to do is open the Software Center, find the app you want, and click Install. It's very easy. In fact, downright iPad App Store easy. Where things do get dicey, and I agree it can become really difficult, is when a particular application is not in the repository. You have a few options here, though: see if the application is offered in a third-party repository (Launchpad PPAs are a massive collection of third-party repositories), add the repository to your sources list, then reload your Software Center; or see if the app you want exists as a .deb package, which is the functional equivalent of the Windows installer. If the app in question is neither available in a repository nor as a .deb package, then you'll have to compile, package, and install it yourself. That IS a royal pain in the rear. Fortunately, this really isn't necessary all that often.
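
A rough sketch of those two routes (the PPA and package names below are placeholders, not real ones):

sudo add-apt-repository ppa:someteam/someapp   # add the third-party repository and its signing key
sudo apt-get update
sudo apt-get install someapp

sudo dpkg -i someapp_1.0_i386.deb              # install a standalone .deb package
sudo apt-get -f install                        # pull in any dependencies dpkg couldn't resolve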

Neon Samurai

Honest question, but how do you find software installation different between Windows and Linux-based systems? Are you downloading packages from websites and then installing them manually? Have you looked at the package manager for what is available in the distribution's repository? I've heard this before, so I like to get more details when it seems like an honest user gripe. In case it helps:

The preferred normal method is the network repository. You have a package manager installed to browse or search the listings for the program you want. Select it, select any others you want, and press OK to install them all at the same time. To uninstall, just uncheck the previously installed package and hit OK again to watch it go away. The Windows equivalent would be Windows Update, if it included checkbox options to download any Windows software, or at least any that had passed MS vetting. The Apple App Store, Google Market Place and similar are consumer-packaged examples of network repository installation.

Website package downloads are not ideal, but sometimes there are programs available which are not hosted in the distribution's repository. In that case, check that you have the correct package for your distribution: if it's a .deb-based distro, you want the .deb file in 32-bit or 64-bit as applicable. Download it, double-click it, and you should be asked for the admin password and then see the package manager take over. The Windows equivalent would be the age-old "download setup.exe and run from desktop to install". This really isn't the easiest install method available, but it is the most common.

If it's a tarball (.tar.gz, .tgz, .tar.bz2), then things get more variable. You need to be aware whether it's a source tarball you'll need to unzip and then "./configure && make && make install", or a binary you can just unzip and run. This is not desirable, since it means going outside the distribution's vetted repositories and probably involves a higher degree of knowledge. The Windows equivalent would be a .zip with a setup script or a manual install process; basically not as pre-packaged as a single setup.exe file that uncompresses itself and runs the setup steps. PortableApps are a similar implementation to tarballs containing executable binaries: uncompress to a folder and run the executable. Side note: VMware and Nvidia both do a very good job of delivering tarball installs with good setup scripts. VMware Server was no more difficult than the Windows GUI install wizard. Nvidia pretty much asks for confirmation, installs what it needs to build the video drivers, then asks you to reboot; installing the Nvidia-provided driver package under Windows was about the same complexity, but with more pretty pictures and a swanky progress bar. These both show that an install can be done through an easy wizard even in the text-only terminal environment.

Installing from a version management system would be another option, but not one I'm suggesting for new or average users. In that case, one is getting a copy of the actual developer's working directory tree. When I use SVN to get my initial Metasploit install and future updates, I'm actually syncing down a copy of what the developers are working on; so bleeding edge it may include a newer version and plugins than the website's own metasploit.zip file download.

For uninstall, if it was installed from a package, then the package manager can probably remove it from the system. If I "aptitude install vim", I can later "aptitude remove vim" or "aptitude purge vim" (uninstall and remove lingering config). I list a variety of ways in case one of them helps, but you're really looking at install from the repository or potentially install from a third-party package download.
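
For the source-tarball case, the classic sequence looks roughly like this (the package name is a placeholder):

tar xzf someapp-1.0.tar.gz
cd someapp-1.0
./configure          # check for dependencies and generate the Makefiles
make                 # compile
sudo make install    # install, usually under /usr/local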

lehnerus2000

I meant network (and file - thanks, Jaqui) permissions: Owner: r-w-x, Group: r-w-x, Others: r-w-x. Apparently there are hidden settings which we haven't covered in my course yet. Contrast that with the plethora of Windows permissions and Group Policy settings. As for the OS, I suspect that Linux should be easier to set up, as you only have to deal with text files, as opposed to Registry GUIDs.
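
For reference, those three triplets map directly onto chmod; a quick sketch on a hypothetical file:

ls -l report.txt                   # shows something like -rwxr-x--- owner group ...
chmod u=rwx,g=rx,o= report.txt     # owner: read/write/execute, group: read/execute, others: nothing
chmod 750 report.txt               # the same thing in octal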

damian205

If you dislike Aero Snap, you are able to turn it off. Begin by clicking the Start button and typing Snap in the Start Search box. When you do, you'll see a result titled Turn Off Automatic Window Arrangement. When you select that result, you'll see the Make the Mouse Easier to Use panel in the Ease of Access tool, and can select the Prevent Windows from Being Automatically Arranged when Moved to the Edge of the Screen check box. With regard to Apple, I was particularly referring to the practice of locking iPods etc. to one computer, one choice of software, one choice of format. Likewise with the App Store. With Windows you are at least able to choose MP3, Ogg Vorbis, AAC, WMA etc. and find some software that you like to use. You will notice that I did mention that for specifically customer-facing machines we are still using Ubuntu 9.04, and we are still using GNOME as the interface. This is for all customers, and no one has ever asked for help despite the fact that it isn't Windows. For speed and ease of use my own favourite is Puppy Linux, but that's just me, and I guess the great thing for people like us is that we are fortunate enough to understand that we have a choice and have enough knowledge to exercise it.

seanferd

That is the only rolling Mint of which I am aware. And that's Debian (testing). All the rest are static, except for maybe some of the XFCE edition, which is using more Debian and less Ubuntu as of 10. Usually Mint just tells you if there are updates, just like Windows, from the systray. I have no freakin' idea how Mint2Win handles any of this, but I'd think that once Mint is installed, it would behave like a "normal" installation. http://www.google.com/search?ie=UTF-8&oe=utf-8&q=linux+mint+upgrade and http://www.google.com/search?q=upgrade+wubi+installation

Slayer_

Of the fake HDDs it has set up for itself?

ddalley

With a rolling release, I don't have to start over. I upgraded Ubuntu once, but then I started reading the horror stories of people who attempted to upgrade and failed - miserably. There are risks in upgrading, although, today, these may be less than what they were a few years ago. I didn't have trouble upgrading, and friends have not had trouble upgrading, but you do run risks of having it fail that I don't have to worry about with a rolling release. Rolling releases can give trouble, too, depending on what the release is based on, but those I am willing to suffer. Once you have upgraded an install once, I am told that you really shouldn't upgrade it again, so I ignore these possible problems as well. Reinstalling is something I prefer to avoid and, today, it is easier to avoid doing so. It's my choice.

Jaqui

[b]if it's a .deb based distro then you want the .deb file in 32bit or 64bit as applicable. Download it, double click it and you should be asked for the admin password then see the package manager take over.[/b] It seems the apt package manager lacks that install-from-any-folder ability. You need to jump through hoops and set up a local repo, then create the package list info for it, and make sure you update the package info every time you put a new package in the local repo. Then it's just a matter of running the package manager. But double-clicking on a .deb always throws an error; the system doesn't know what to do with it.

lehnerus2000

"I was saying that Unix-like OS filesystem permissions are more extensive in what they do than MS Windows filesystem permissions; Unix-like OS file attributes are more extensive in what they do than MS Windows file attributes; and saying that ACLs on MS Windows do more than file attributes on Unix-like systems, so Unix-like systems do not do as much for dealing with how files are managed, is ignoring the fact that Unix-like systems also have ACLs." OK, in that case I agree with you. Thanks apotheon. :) Meh, Ubuntu is ... OK.

apotheon

> So what you are saying is, Linux uses permissions to do the same things that Windows does with ACLS? I was saying that Unix-like OS filesystem permissions are more extensive in what they do than MS Windows filesystem permissions; Unix-like OS file attributes are more extensive in what they do than MS Windows file attributes; and saying that ACLs on MS Windows do more than file attributes on Unix-like systems, so Unix-like systems do not do as much for dealing with how files are managed, is ignoring the fact that Unix-like systems also have ACLs. > I installed it in WMware Workstation, but I was unable to run Unity. I've never used Ubuntu 11.04 or Unity, so I have no comment to make about this. Frankly, I think Ubuntu is a steaming pile of crap in general, trying so hard to beat MS Windows at its own game that it is throwing away the benefits Unix-like systems have over MS Windows with gleeful abandon.

lehnerus2000

So what you are saying is, Linux uses permissions to do the same things that Windows does with ACLs? Our instructors refer to Windows ACLs as permissions (DACL) and rights (SACL). Therefore the terms have different meanings in Linux and Windows. Windows file attributes have very limited scope (Read Only, Archive, Hidden, Encrypt or Compress). As for the topic, I just downloaded the Ubuntu 11.04b (64-bit) ISO. I installed it in VMware Workstation, but I was unable to run Unity. Therefore it seems basically the same as Ubuntu 10.04 (64-bit).

apotheon

Yes, hence the ACL parts of those names. My point is that MS Windows ACLs are not analogous to the parts of the file attribute system that Unix-like systems have and MS Windows lacks. You started out saying that somehow Linux-based systems do not have as many permissions capabilities as MS Windows, but the reverse is actually true.

lehnerus2000

Hence *ACL. In Linux, the way we have been instructed is: you create everything as root; you create a Samba user and password; you create a Linux user and password resource; you reassign the file permissions to allow the user to use the file.
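
As a rough outline of those steps in commands (the user name and path are made up):

sudo useradd -m alice                       # the Linux account
sudo passwd alice
sudo smbpasswd -a alice                     # the matching Samba account and password
sudo chown alice /srv/share/report.txt      # hand the file to the user
sudo chmod 640 /srv/share/report.txt        # and give her the permissions she needs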

apotheon

That's a separate matter, and also covered on Unix-like systems.

lehnerus2000

You can do the same in Windows (not with the attrib command though). The CLI command is the awful ICACLS (the GUI is much easier to use). Remove everyone except yourself from the DACL window (Security tab in file/folder properties) or restrict their options/rights. Don't remove Trusted Installer (if it's present). :) The DACL has 6 settings and the SACL has 14. I'm not saying that Windows is better or more secure than Linux. I'm saying that in a specific situation on "my LAN", Windows is easier to setup.

Neon Samurai

No one has mentioned attributes yet. In Windows, one can use the attrib command and get five or six (read only, hidden, archive...). Try the equivalent on a Unix-like system and you're looking at ten or more file-level attributes, including things like "only permit changes by the original creator".
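
For example, on an ext3/ext4 filesystem (the file names are hypothetical):

# Windows side, for comparison: attrib +r +h notes.txt
lsattr notes.txt            # list the file's attributes
sudo chattr +i notes.txt    # immutable: the file can't be changed or deleted until the flag is cleared
sudo chattr +a app.log      # append-only: handy for log files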

apotheon

The file permissions in Unix-like systems are equivalent (in purpose) to the file permissions in MS Windows -- not Active Directory Group Policy settings or anything like that. If you want something equivalent to AD settings, you need to look into LDAP and Kerberos, which happen to be the protocols on which AD is based anyway. Then, of course, there are ACLs, system hardening tools like SELinux, and so on. By the time you're done, you have at least an order of magnitude more options for fine-grained control of access permissions on open source Unix-like systems than on MS Windows.
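
For instance, POSIX ACLs on Linux (acl package, filesystem mounted with the acl option; the user and file names are made up):

setfacl -m u:alice:rw report.txt   # grant alice read/write on top of the owner/group/other bits
getfacl report.txt                 # show the full access list
setfacl -x u:alice report.txt      # remove that entry again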

lehnerus2000

My bad (thanks for that correction). You still need to set those in Linux and Windows to access resources via the network.

Jaqui

Those are filesystem permissions for everything.

Slayer_

There is a lot of room in Mint for other performance issues to be fixed first. I am tempted to take a video of some of them.

Neon Samurai

If performance is that important, you might want to look at mounting a physical disk as your VM's disk versus mounting a blob file. I know VMware clearly offers that ability, though I haven't had reason to look for it with VirtualBox. The better solution may be to drop your VMs on an SSD and accept that the combination of higher writes and SSD write limits means more frequent drive replacements, or to go old school with RAID 0 to max out the possible performance of platter drives. Without tweaking your hardware setup, I envision the two or three layers of defrag previously mentioned working like this: the host OS defrag moves the blob file to the start of the drive as a consolidated block, so you get less head movement during read/write. The guest OS uses a hard-set drive size instead of expanding, so you're not adding undue fragmentation during drive size increases. The guest OS does its regular defrag inside the blob file. You get the blob file at the start of the platter and the applicable files within the virtual hard drive at the start of the virtual platter. I'm not sure if VMware's .vdi defrag process affects where the blob is on the platter or simply consolidates the blob fragments, but that factors in too, if it's in use. Either way, I'm not sure you'd see much real performance increase unless you're running an industrial ESX-type dedicated VM server.
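
As one concrete option along those lines, VirtualBox can pre-allocate a fixed-size disk up front instead of a growable one (the file name and size here are placeholders):

VBoxManage createhd --filename mint.vdi --size 20480 --variant Fixed   # 20 GB, fully allocated at creation time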

Slayer_

So if, for example, the filesystem was trying to improve performance by moving files closer to sector 0 (unlikely, but it helps my explanation), the physical location on the platter could be anywhere, and it could be fragmented. It would effectively be lining up files across the fragmentation, and actually slowing things down. Fortunately, mine is only in 3 fragments, so it's probably not that bad.

Neon Samurai

A virtual filesystem knows where virtual sector 0 is the same way a filesystem hosted on physical media knows where physical sector 0 is. The physical hardware presents the hard drive to the OS, which mounts the partitions to access the contained filesystems. The VM software presents the virtual hard drive to the guest OS, which mounts its partitions and accesses the filesystems contained within. Consider VirtualBox or VMware. When you create a VM, you can create a hard drive for it to install the guest OS on. From inside the VM's booted OS, you see a hard drive. From outside the VM software, you can see that it's actually just a blob file (.vdi or whatever format). The VM can't suddenly start writing to the host's physical hard drive because it can't see outside of its virtual world. Consider an ISO file. It's a binary duplication of a drive, normally a CD or DVD. Sector 0 on the DVD is the same as sector 0 of the disk.iso file. When you drop the disk in the DVD reader, you're limited to the partition start and end on that disk, even if it doesn't take up the full available space. If you mount the ISO file directly, you are working within the limits of that contained starting and ending point. In the case of Mint4Win, it's designed to use a blob file as its "hard drive", so it's not looking for sector 0 of the hosting media but sector 0 of the virtual hard drive. It doesn't care that its hard drive is simply a file on a bigger hard drive, because it's not operating at the same level as the host OS. It's operating within an emulated environment and limited to that. Anything you do within the booted Mint4Win window should be limited to its hard drive space. My guess is that if you initiate an upgrade process from within that booted Mint4Win, it will be limited to that blob file, because it thinks the contents of that file are its partitions and filesystems. Now, if you're initiating an upgrade from outside the Mint4Win booted session... no idea. I'd hope that an external upgrade process would be aware enough to read the old config and mount the blob file, or ask to be directed to the "old version's hard drive", as it were.

Slayer_

How can the virtual filesystem know where the drive is? Sector 0 for the blob isn't really sector 0. So you get fragmentation of the blob file, coupled with the natural fragmentation, and it's probably made worse by the OS's defragmentation disorganizing the files even more, as it can't really tell its own position on the platter.

Neon Samurai

You have the FAT32/NTFS fragmentation of the container file plus the natural fragmentation in the ext3 (or whatever) inside the container file. On the up side, if it's ext3 it's doing background defrag, so you're primarily looking at the container file on top of its host filesystem. How much moving around hits that file when you run your defrag? (I'm partial to MyDefrag due to its good sort order and detailed display, including which file is under the mouse pointer.) Since Mint2Win uses a single blob file, I'd just make sure to start the upgrade process from within the booted Mint session, if one isn't going to do a test VM for confirmation.

Slayer_

It makes a massive blob file on the drive, and does its work within. I suspect the fragmentation is rather horrid but it works very well anyways.

Neon Samurai

If it's a VM you're using, then it shouldn't suddenly start working outside the VM. If it does, your VM software is severely broken. When working in a VM, nothing should be unexpectedly able to reach outside that VM's environment. You should be fine to break a few VMs trying the upgrade process before going anywhere near your production VM install. To be sure, though, I'd stick with an ISO mounted into the VMs rather than mounting the physical disk drive into the VM. Then you can be sure that all is contained from boot to shutdown, since you'll only see the resources through the VM's instance. With Mint2Win, I'm not sure how it does its magic. If it's creating a fake HD inside a big blob file, then all should remain contained, since the upgrade process is running within the Mint (VM? emulation?). That's why I originally suggested doing a Mint2Win install under a FAT32 or NTFS VM, so you can actually see how it behaves.

Neon Samurai

dpkg -i filename.deb, since apt-get and aptitude want the repository rather than a local file. I thought Ubuntu and such were doing double-click installs from the GUI package manager, though. Admittedly, my last experience with GUI management was Mandriva: drop your .rpm on the desktop, double-click, enter the root password. For all the other decisions Canonical has made, I'm rather surprised they haven't implemented this. I'll have to boot up an Ubuntu live CD and have a look.
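
For what it's worth, one helper that handles a local .deb and resolves its dependencies at the same time is gdebi (the package name below is a placeholder):

sudo apt-get install gdebi
sudo gdebi someapp_1.0_i386.deb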