4.0.3

Hi :slight_smile:
I think Windows will continue to dominate the desktops and full laptops, maybe even down to notebooks.

Netbooks died off for one reason only: people wanted Windows on them and then found that the machines ran ridiculously slowly.  Any version of Gnu&Linux, even Ubuntu, made the machines really fast and seemingly powerful.  People weren't ready for non-Windows back then.  However, netbooks were a good ice-breaker.  Now people are happy with even smaller and less powerful devices.

So, while Windows dominates the desktops, we keep hearing that the "age of the desktop is over", and it's already being reported that the amount of time people spend on desktops versus hand-helds has already tipped in favour of the hand-helds and other mobile devices.

Regards from
Tom :slight_smile:

Hi :slight_smile:
I have to disagree.  Amit does have some good points even if some minor details are not entirely accurate.

It's a subject we often argue about here.  Yes we do need to follow MS's lead and keep working at greater and greater compatibility with their formats and their ways of doing things.  That is why we do invest a LOT of time and resources into doing exactly that.  Amit is right.

However, their format does keep changing around a bit between one release of their program and the next.  It's unpredictable, despite the name of their format staying the same and despite them having acquired the ISO stamp of approval for the name of their ever-changing format.  So they make one small tweak here or there and keep everyone busy trying to guess where the change is and how to read it now.

The main problem is that if we always follow MSO's lead then they will always be in the lead.

Regards from
Tom :slight_smile:

Urmas wrote:

So the open document standards were
born and ratified and accepted by the majority of the world that counts.

Microsoft is using an open standard format called OpenXML. Stop
pushing your vendor-locked ODF crap here please.

You may want to read up a bit on OpenXML and how it was rammed through
ISO and also how Microsoft themselves don't follow it. The way it
became a "standard" was an absolute farce that crippled an ISO committee
that's also supposed to handle other stuff. Are you aware OpenXML
perpetuates that year 1900 Excel bug and makes it standard? Did you
know about all the non-disclosed binary blobs that are part of OpenXML?
Now tell me again who's pushing "vendor-locked crap".

http://www.groklaw.net/staticpages/index.php?page=20051216153153504
http://www.groklaw.net/staticpages/index.php?page=20080719233709726

Urmas wrote:

Another major reason is huge bribes given to government officials to
deploy {Libre|Open}Office solutions in budget-funded institutions
worldwide.

Actually, if you care to check the facts, it's Microsoft that's been
doing that.

Tom Davies wrote:

Netbooks died off for 1 reason only. That was because people wanted Windows on them

Actually, there was a bit of MS strong-arming of manufacturers as well. I
have an Asus Eee PC, which I loaned to a friend. She loves it, even
though her computer experience had previously been only on Windows.
Look at the efforts MS is making to keep other operating systems off
newer systems, with UEFI requiring signatures provided only by MS.

Andrew Brown wrote:

yes where MS currently dominates, but not for long.

Of course one also has to look at why MS dominates. The reasons include
strong-arming and extortion. MS has long been an unethical company,
going back to when Bill Gates and Paul Allen developed a BASIC
interpreter for the Altair 8800 computer. MITS, the company that
produced the Altair, hired Bill Gates and Paul Allen to create the
interpreter. They developed it on the Harvard computers, in violation
of the policy that those computers not be used for commercial purposes.
Then, after BG saw the commercial potential of BASIC, he claimed he owned
the software MITS had paid for. Then, in the DOS/early Windows days, MS
required computer manufacturers to pay for a licence on every computer
sold, regardless of whether it shipped with DOS/Windows. There was also a
test in Windows that checked whether it was running on DR-DOS and, if so,
threw up a bogus error message. Then there was the issue of the Windows
APIs, where MS apps ran better than competitors' because MS used hidden
APIs that were not available to competitors. Or how they told WordPerfect
about the new Windows APIs, but changed them just before Windows 95
(IIRC) was released. Or how MS refused to licence Windows 95 to IBM
unless IBM stopped promoting OS/2 (there's also the incident where MS
misappropriated IBM's money for OS/2 development to Windows). Or how
they claimed in court that Internet Explorer couldn't be removed because
it was part of the operating system. It wasn't at the time, but soon
became so, opening up all sorts of security problems in the process.
More recently, with site licences, MS requires all computers to be
licensed, even if they can't run the software. The list of MS's unethical
behaviour is a long one.

And some more info and statistics with charts, to refute your claims
about the combined market share of Windows XP and Vista (the latter also
a disaster for MS).

http://www.statista.com/topics/823/microsoft/chart/799/market-share-of-selected-windows-operating-systems/

And it shows what you know of Linux. FACT: it, along with various flavours
of Unix, powers the known global Internet servers, observatories,
MET/weather offices, space exploration, the Mars machines, the majority of
military machines/equipment, medical equipment, and, lo and behold, a good
number of desktops and laptops around the world, plus the no. 1 OS for
mobile - Android - followed by Firefox OS and Ubuntu Touch.

You might also include iOS, which is a stripped-down version of OSX, a
flavor of BSD.

Hi :slight_smile:
wrt Virtual Memory/pagefile.sys/Swap on Windows the trick seems to be to set it as a fixed value.

Find:
"System Properties" - Advanced tab - Performance (top third): Settings - Advanced tab (here too) - Virtual Memory (bottom section): Change
There will be about 3 pop-ups open around now.

Use the radio buttons there to change to a "Custom size".  This really needs to be greater than RAM but not more than 2xRAM (else it gets confused and may even reduce performance while tripping over its own shoelaces).  It has to be greater than RAM because when hibernating (perhaps sleeping too?) the contents of RAM get written to Virtual Memory.  But giving it too much space just confuses things, so just under 2xRAM is good but over that might get annoying.  Make sure the same number is in both the top and bottom boxes.  Often there is a recommendation for how much to set it to, and it's usually not a bad idea to follow that advice.  I've only seen it give a crazy suggestion once or twice out of hundreds of machines.
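If you like scripting these things, here's a rough Python sketch of that sizing rule (the 1.5x factor is just my illustrative pick inside the "more than RAM, under 2xRAM" window, not any official recommendation):

    def recommended_pagefile_mb(ram_mb):
        # Fixed pagefile size: bigger than RAM (so hibernation can dump
        # all of RAM to disk) but comfortably under 2x RAM, per the rule
        # of thumb above.  The 1.5x factor is an assumption.
        return int(ram_mb * 1.5)

    ram_mb = 4096  # e.g. a 4 GB machine
    size_mb = recommended_pagefile_mb(ram_mb)
    print(f"Set both Initial and Maximum size to {size_mb} MB")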

Ok, now it gets a bit fiddly.  You have to click on the "Set" button before clicking on "Ok", otherwise it forgets and you have to re-type the numbers.  Then you click "Ok" on each of the pop-ups in turn.  Again, if you don't it's not harmful, just annoying because it forgets.

Of course, if you have already been using your machine for a while then Virtual Memory is already quite fragmented, so this will only 'stop' it getting worse.  It won't improve things.  Also, when I say 'stop', the machine will continue to suffer normal system rot, and there are other factors, such as registry fragmentation, that will continue.  So it fixes just one problem out of many.

When trying to resurrect an ancient and much-used machine I would initially set Virtual Memory to 0.  Then defrag quite a lot and then plonk a fairly huge file onto the system.  Then reset the Virtual Memory to a respectable size and get rid of the huge file.  In theory I hoped that would force the whole Virtual Memory file to be contiguous and out of the way.
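The 'plonk a huge file' step can be scripted, for what it's worth.  A minimal Python sketch, assuming roughly 4 GB (the path and size are made-up examples; it writes real data rather than a sparse file so the blocks actually get allocated):

    CHUNK = 1024 * 1024   # write 1 MB at a time
    SIZE_MB = 4096        # roughly the intended pagefile size

    # With the pagefile disabled and the drive defragged, fill a big file
    # so it occupies the free space near the front of the disk...
    with open(r"C:\placeholder.bin", "wb") as f:
        for _ in range(SIZE_MB):
            f.write(b"\0" * CHUNK)

    # ...then re-enable the pagefile (in theory it lands in the contiguous
    # space beyond the placeholder) and finally delete the placeholder:
    # import os; os.remove(r"C:\placeholder.bin")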

Gnu&Linux does NOT SUFFER from fragmentation until the drive is something like 96% full - not sure of the exact figure but definitely over 90% (it's always that one extra episode/movie of Star Trek).  Files might well become fragmented sooner than that, despite the elegant way that files are carefully placed in Ext2/3/4 with plenty of room all around them to allow them to grow.  There is a limit to how far that policy can stretch, of course.  However, even when files are fragmented there seems to be a better system for tracking where all the bits are, so the read/write head can anticipate and plan ahead a bit better.

So what I find odd is that, despite all that, Gnu&Linux doesn't use a swap file by default!  One of the main rules in Gnu&Linux is that for any 'rule' there is always at least one version or distro or something that deliberately breaks that rule, but in the case of Swap I haven't found one yet.  They all seem to follow it!  They all seem to use a separate Swap partition or don't use Swap at all.
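You can check what any given box actually does: /proc/swaps lists each swap area and whether it's a partition or a file.  A quick sketch (standard Linux path, nothing distro-specific):

    # Print each swap area and its type ("partition" or "file").
    with open("/proc/swaps") as f:
        for line in f:
            print(line.rstrip())

    # Typical output on a default install (a separate partition):
    # Filename    Type        Size     Used  Priority
    # /dev/sda5   partition   4194300  0     -1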

In Windows, which can't cope with fragmented files and couldn't (until fairly recently) defrag system files, people insist on setting Virtual Memory up so that it fragments as quickly as possible.  Sometimes they set a fixed lower amount and only let the top-up vary, but that still means the file gets read and re-written elsewhere, and fragmented.

Normally, by default, it's set to keep changing size according to how much of it is needed.  That sounds good in theory: when you need more memory it just expands to fill up more hard-drive space, and when you need less it releases some of it.  You can get Gnu&Linux to use a swap file just the same, instead of (or as well as) having a separate fixed swap partition.  Unfortunately, Windows file-systems such as the various FATs (vFAT, FAT32, etc.) and NTFS are carefully designed to make sure files fragment quite quickly and end up with bits scattered all over the place.

Say you have file A that is 20 units long and the next file B is 10.  Then you delete A and write a file C that is 30 units.  Now you have 20 units of C followed by 10 units of B followed by the remaining 10 of C.  If you now delete B and copy A back then you get 20 of C, followed by 10 of A, followed by the 10 remaining of C and then the last 10 of A.  So when you try reading a file the read/write head lurches around the drive trying to find the various chopped-up parts of the file.  If that file is a frequently accessed system file such as Virtual Memory then it can significantly reduce performance.
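That A/B/C dance is easy to reproduce with a toy first-fit allocator.  A little Python sketch (illustrative only - real filesystems are far more sophisticated):

    disk = []  # one entry per unit: owning file's name, or None if free

    def write(name, units):
        # First fit: fill free units left to right, extend the disk if needed.
        placed = 0
        for i, owner in enumerate(disk):
            if owner is None and placed < units:
                disk[i] = name
                placed += 1
        disk.extend(name for _ in range(units - placed))

    def delete(name):
        for i, owner in enumerate(disk):
            if owner == name:
                disk[i] = None

    write("A", 20); write("B", 10)
    delete("A"); write("C", 30)  # C: 20 units in A's old hole + 10 at the end
    delete("B"); write("A", 20)  # A: 10 units in B's old hole + 10 at the end
    print("".join(x or "." for x in disk))
    # -> CCCCCCCCCCCCCCCCCCCCAAAAAAAAAACCCCCCCCCCAAAAAAAAAA

Two files, four fragments between them - exactly the lurching described above.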

In Gnu&Linux it is reckoned that you can significantly increase performance by putting your system files, particularly your log files, on a different hard-drive from your data.  I mean a proper separate hard-drive, not just a different partition on the same physical device.  The main reason for putting your data (all in /home) on a separate partition is not to do with routine performance; it's more about making the system more robust.  It allows you to install a completely new OS without any risk to your data (but still back up anyway, of course).  In theory you can have several different OSes all using the same /home, although that gets a bit messy if they have the same DE.  It works a bit better if you have one KDE one, one Gnome(ish) one, and maybe one of the rarer ones (does Unity count as one of the rarer ones? I'd say it does but I'm sure others disagree).  Otherwise you find all your different OSes use the same wallpaper and look the same (big yawn that is) and you don't get the benefit of the different design teams' interesting work.
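For reference, a separate /home boils down to a couple of lines in /etc/fstab.  An illustrative sketch only (the UUIDs are made up - check your own with blkid):

    # <device>      <mount>  <type>  <options>  <dump>  <pass>
    UUID=aaaa-1111  /        ext4    defaults   0       1
    UUID=bbbb-2222  /home    ext4    defaults   0       2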

Something I haven't really tried much, or at least can't remember the result of, is putting all the Virtual Memory on a separate physical hard-drive.  There is an option to split Virtual Memory across several different hard-drives/partitions, some of which might be physically different drives, but I'm not sure whether doing that is good or bad.

Errr, I haven't mentioned BSD or Apple because I just haven't played around with them that much.  They don't seem to slow down as much as Windows, so I guess they have a similar set-up to Gnu&Linux or some neat work-around that might not translate well to Gnu&Linux, let alone Windows.

Regards from
Tom :slight_smile:

Hi Jomali

Agreed, iOS and OSX are built from portions of BSD, as well as NeXTSTEP (OpenStep, once Apple bought out NeXT). It does not play a major role in mainframes and servers, but I concede it did for a long time dominate the tablet market, and it raised Apple's desktop and laptop share enough to take a slice of the pie away from MS.

Regards

Andrew Brown

Hi Tom

Ah, OK, I see - this is the same methodology I'm using. I generally turn off the swap file for a badly fragmented drive, including any hibernation files etc. if active or used on a laptop, then defrag (Disktrix UltimateDefrag, possibly the best I've used to date). After a good clean-up I then set the pagefile and any hibernation files if necessary.

With UD's FragProtect, this only has to be done every few months, and it is one of the few defraggers that can defrag and place the MFT at the beginning of the drive, along with the folder entries, ahead of any data. But this has to be done with a reboot into MS pre-install mode (UD does it all automatically) to complete the task. I've benched my drives on all of my systems, and it certainly makes for very fast boot and shutdown times, and better stability.

Regards

Andrew Brown

Hi :slight_smile:
Is Disktrix UltimateDefrag free?  FOSS?  Lol, somehow I doubt it but I keep an ear out just in case.

I tend to use the inbuilt Windows one.  I don't really care enough anymore to go beyond that.  Back when I did care I used PerfectDisk; it usually has a 1-month free trial and that was usually enough for me.  Nowadays I just prefer to do a reasonably good job and, since that is far, far ahead of the way most systems are set up, I settle for that.  I've even found a tendency for machines in England to be set to US localisation and such.

If I want a fast system I just reboot into Gnu&Linux.  Windows has other advantages, but speed and security are not top of the list!

Eskimos have a lot of words for snow and ice because they see a lot of it.  Windows has a lot of words for different security issues because it suffers from tons of different things.  [shrugs]  I still use Windows quite a bit though, because when you know a thing's flaws it's usually easier to cope.  Like going round to see a cat owner who insists their cat is always free of fleas: you just know you are going to get bitten, so you just deal with it.
Regards from
Tom :slight_smile:


Hi Tom

No, it's payware - https://www.disktrix.com/ - but well worth the $30.00, and only necessary for Windows. If you do want to care again and want a very good free version, then Piriform's Defraggler is a great product: http://www.piriform.com/. They've recently added a payware model, but still keep their freeware versions, and they have three other great tools - CCleaner, Recuva and Speccy - also in freeware or payware versions. I use them all in Windows (except Defraggler on my own system, as that is replaced with UD) as well as for friends and clients, and have never needed their payware versions. They have never let me down or trashed any system I've used them on in the last five years.

Yep, my Ubuntu, with the pause at the login screen included and the fastest I can type my password, takes all of 20 seconds to boot, and shutdown about 10 seconds. Agreed, Windows still has its place, and I have to be familiar with it due to my business and support of my clients. I even have an old PowerMac to keep up to date with my few clients using Macs.

Regards

Andrew Brown

+1

Urmas - if you have evidence to substantiate your claim, please share. Otherwise, as James writes: you have LibO confused with the dirty tactics employed by MS.

According to http://store.steampowered.com/hwsurvey, Windows 8 is running on 14% of computers.
That's more than 10 times Linux's market share.

Are you aware OpenXML
perpetuates that year 1900 Excel bug and makes it standard? Did you
know about all the non-disclosed binary blobs that are part of OpenXML?

It is much better than ODF, which documents nothing and depends on a reference implementation from a single vendor, Sun.

Year 1900 being treated as a leap year is a universal convention which predates Excel by several years.

Someone had asked about a free/FOSS defragger...

There is UltraDefrag:

http://ultradefrag.sourceforge.net/en/index.html

I don't use it much, but that's only because disk fragmentation is not nearly as big a problem on modern systems as it used to be.

Windows 7+ does a pretty good job of avoiding fragmentation all on its own.

Urmas

Steam is a game platform, and here are some facts from their own info web page - quote "As of December 2012, there are nearly 2000 games available through Steam, and 54 million active user accounts. As of January 2013, Steam has seen over 6.6 million concurrent players. Steam has an estimated 50-70% share of the digital distribution market for video games. As of January 2013 they have 6.6 million active gamers." unquote (https://en.wikipedia.org/wiki/Steam_(software))

Note the video games part. So your stat of 14% is a measure of either the 54 million active accounts or the 6.6 million active users - which is it? Now compare that to the roughly 1.1 billion computers around the world: a very different base for your Windows 8 percentage than the one you set against Linux. The Linux desktop is estimated to be around 2% of the world's computers - comparable, not left behind by your Windows 8 stat. On the Internet alone there are 10 million core major computer systems running Linux, and about 120 million sub-servers also running Linux, so your 14% of 54 million compares poorly to your own supplied stats.
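To put rough numbers on that (simple arithmetic on the figures quoted above, nothing more):

    steam_accounts = 54_000_000      # active Steam accounts (quoted above)
    w8_share_of_steam = 0.14         # Windows 8 share in the Steam survey
    world_computers = 1_100_000_000  # rough worldwide install base

    w8_on_steam = steam_accounts * w8_share_of_steam
    print(f"{w8_on_steam:,.0f} Steam machines on Windows 8")    # 7,560,000
    print(f"{w8_on_steam / world_computers:.2%} of the world")  # ~0.69%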

And I'll quote for you again some facts about Linux servers - note especially the last paragraph!

      Servers, mainframes and supercomputers

[Image: Servers designed for Linux]

Linux distributions have long been used as server operating systems, and have risen to prominence in that area; Netcraft reported in September 2006 that eight of the ten most reliable internet hosting companies ran Linux distributions on their web servers. Since June 2008, Linux distributions represented five of the top ten, FreeBSD three of ten, and Microsoft two of ten; since February 2010, Linux distributions represented six of the top ten, FreeBSD two of ten, and Microsoft one of ten.

Linux distributions are the cornerstone of the LAMP server-software combination (Linux, Apache, MySQL, Perl/PHP/Python) which has achieved popularity among developers, and which is one of the more common platforms for website hosting.

Linux distributions have become increasingly popular on mainframes in the last decade partly due to pricing and the open-source model. In December 2009, computer giant IBM reported that it would predominantly market and sell mainframe-based Enterprise Linux Server.

Linux distributions are also commonly used as operating systems for supercomputers: since November 2010, out of the top 500 systems, 459 (91.8%) run a Linux distribution. Linux was also selected as the operating system for the world's most powerful supercomputer, IBM's Sequoia, which became operational in 2011. (from https://en.wikipedia.org/wiki/Linux)

Unquote

Andrew Brown

Please don't feed the trolls

Hi Tanstaafl

Yes, a good choice, I forgot about UltraDefrag.

Regards

Andrew Brown

Urmas wrote:

According to http://store.steampowered.com/hwsurvey, Windows 8 is
running on 14% of computers.
That's more than 10 times Linux's market share.

Lessee now. I have 5 computers here. Only one has Windows on it, and it
spends most of its time running Linux. I have a tablet and a smart
phone; both run Android (Linux). I have a TV, A/V receiver and Blu-ray
player, all running Linux. I also have an Asus Eee PC (borrowed by a
friend) that runs Linux, and even my WiFi access point runs Linux. Also,
how do you get W8 on 14% of computers, when the facts show
otherwise? Windows 7 and XP are both used on many more computers. In
fact, computer manufacturers were claiming W8 was responsible for much of
the decline in computer sales. Look at how well Nokia is doing since
switching to Windows Phone: they went from industry leader to also-ran.

BTW, did you hear the news item recently about how the computers on
the International Space Station have all been converted to Linux? Or
how just about all of the top 500 supercomputers run Linux? Or how most of
the servers on the Internet run Linux? Or...

You might also be interested in reading this article:
http://opensource.com/education/13/7/linux-westcliff-high-school