09 Dec 2017 @ 12:27 PM 

Winamp is a rather old program, and to some people it represents a bygone era: the late 90s and early 2000s in particular. However, I’ve not found any “modern” software that compares. There is plenty of software, such as MediaMonkey and MusicBee, that attempts to mimic Winamp or provides the same general capability of managing a local music library, but these programs either don’t support Winamp plugins, don’t work properly with many such plugins, or, most importantly, don’t add anything.

Not adding anything is the important one here. At best, I’m getting the same experience as I do with Winamp, so I’m not gaining anything. People ask, “Why don’t you switch?” and the default answer is “Why should I?” If the only reason is that what I am currently using is “outdated” and no longer cool, then maybe I should stick with it, because we have something in common.

Typically, I’m losing functionality, though. With Winamp I’ve got everything set up largely how I want. More importantly, my library spans not only FLAC and MP3 music files; it also incorporates various video game music formats for various systems, with complete audio libraries for any number of game titles that I can pull up easily. These native formats are much smaller than the same tracks encoded as MP3 or FLAC, and since they are native formats they rely on Winamp plugins, which provide additional features for adjusting audio playback. These plugins simply don’t exist or don’t work with modern software, so I’d have to relegate those video game music formats to specific, individual players if I were to switch to, say, MusicBee for my local music library.

Nowadays, even the concept of a local audio library is practically unheard of. People listen to music using streaming services or even just YouTube videos, typically on a smartphone, where storage space tends to be at a greater premium as well. I find that I detest playing music on my phone (a Nexus 6) simply because there is no good software for managing music saved to local storage, and the phone gets awful battery life when used this way. This is why I use an older 16GB Sony Walkman MP3 player instead; the battery can probably manage a good 48 hours of continuous playback, and it is much more compact than the phone. Even if this means an extra piece of “equipment” when I go somewhere, it means I’m not wasting my phone’s battery life to play music.

Recently, I had the need to do something nearly as “outdated” as the program I elected to do it with: burning an audio CD. I’ve found this to be the easiest way to transfer music to my original Xbox console to create custom soundtracks (a feature which seems to be unique among consoles altogether). So I popped in a CD-RW, opened Winamp, clicked on the CD Recorder… and got a BSOD: DPC_WATCHDOG_VIOLATION.

Well, that isn’t supposed to happen. After determining it was reproducible, I looked further into it. In particular, I found that within the CurrentControlSet registry information for my CD-ROM hardware, a LowerFilters entry specified a driver called PxHlpa64. So I set about searching for what this was.

I found that PxHlpa64 is a driver by “Sonic Solutions” which is used by some CD recording software. I couldn’t find any such software installed that uses it, so I merely renamed the affected value and rebooted. The problem went away and everything was as it should be. (I subsequently wiped out the directory containing the driver file as well.) I suspect that I previously installed a program which used the driver, and the uninstall failed to remove it for any of a number of reasons.
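For anyone hitting a similar BSOD, the fix can be sketched in code. This is a hypothetical illustration rather than the exact steps I took: it reads the optical-drive class key’s LowerFilters value (a multi-string list of filter driver names) and strips PxHlpa64 from it. The class GUID below is the standard CD-ROM/DVD device class; winreg is Windows-only, so the registry access is guarded, and you would want a registry backup before trying anything like this.

```python
# Sketch: remove a problem filter driver (PxHlpa64) from the optical
# drive class key's LowerFilters value. Illustrative only.
import sys

# Standard CDROM/DVD device class GUID under CurrentControlSet.
CDROM_CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
                   r"\{4d36e965-e325-11ce-bfc1-08002be10318}")

def strip_filter(filters, bad_driver):
    """Return the filter list with the offending driver removed."""
    return [f for f in filters if f.lower() != bad_driver.lower()]

if sys.platform == "win32":
    import winreg
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CDROM_CLASS_KEY,
                        0, winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
        # LowerFilters is REG_MULTI_SZ, returned as a list of strings.
        filters, vtype = winreg.QueryValueEx(key, "LowerFilters")
        cleaned = strip_filter(filters, "PxHlpa64")
        if not cleaned:
            # Nothing left in the list: delete the value entirely.
            winreg.DeleteValue(key, "LowerFilters")
        else:
            winreg.SetValueEx(key, "LowerFilters", 0, vtype, cleaned)
```

Renaming the value, as I did, works just as well; the point is simply removing the reference so the broken driver never loads against the drive.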

One of the advantages of having a bit of an idea of what is going on with Windows (or any OS, really) is that you can more intelligently attempt to solve these sorts of unexpected problems. Since I was aware of issues involving optical drives and driver “filter” settings, I was able to find and fix the cause of my issue fairly quickly.

Posted By: BC_Programming
Last Edit: 09 Dec 2017 @ 12:27 PM

 30 Oct 2017 @ 6:52 AM 

A couple of weeks ago, I thought it would be neat to get a computer similar to my first PC, which was a 286. I’d actually been considering the prospect for some time, but the prices on a DTK 286 (DTK was the brand I had) were a bit high. However, I stumbled on a rather cheap listing for a DTK 286 PC; it wasn’t identical to the one I had, but it was a similar model with a slightly reduced case size that seemed otherwise the same, so I snapped it up.

It arrived a little worse for wear from the journey: the front of the case, which was attached via plastic standoffs screwed into the metal case itself, had all of those plastic snaps come off. However, this shouldn’t be too much of a problem, as I’m sure I can get it to stay attached for presentation purposes.

When I opened it up to see if anything else had been damaged, I found the network card was out of its slot, so I pushed it back in. Then I noticed the slot was PCI. 286 systems had 8-bit and 16-bit ISA, so already I knew something was up. The fact that the processor had a heatsink, and sat in a Socket 7, meant this was clearly not a 286 system.

Instead, the system is a Pentium 133 (non-MMX) on Socket 7, with 64MB of RAM, a 900MB hard drive, an ATI Mach64, and 10/100 Ethernet. The floppy diskette drive wasn’t working correctly, so I swapped it for one of my other floppy drives. I also attached one of my CD-RW drives so I could burn data and install programs onto the Windows 95 install running on the system.

The Pentium 133 system

Now, arguably a claim could be made against the seller here, but I think it was sold this way by accident; it seems to be using a specialized industrial motherboard intended to be placed in these sorts of Baby AT cases. I don’t think a standard consumer board had Socket 7 while still using the large, older keyboard DIN connector. The motherboard is apparently quite uncommon, more so with Socket 7 rather than Socket 5. It also has a motherboard cache “card” installed, which doesn’t look to be particularly difficult to find but goes for about half what I paid for the entire unit. The motherboard is also unusual in that it seems to be missing things such as shrouds around the IDE connectors, and it has no serial number listed where specified in the center of the board.

My original intent was to fiddle with MS-DOS and Windows 3.1, and realistically this Pentium system could work for that purpose; I have a few older IDE hard drives I could swap in to set up a dual-boot between MS-DOS/Windows 3.1 and Windows 95. The Mach64 is an older card, but it is well supported on Windows 95, Windows 3.1, and MS-DOS, so it seems like a good fit. It only has 1MB of RAM, so higher resolutions drop the colour depth; 1024×768 is only doable in 256-colour modes, for example. I might want to get some DIP chips to install and upgrade the VRAM, as it has two empty sockets. (Ironically, it might be cheaper to just get another Mach64 with the chips already installed.) I was also able to add a Creative AudioPCI card I had lying around without too much hassle, though there are better options for MS-DOS and Windows 95 audio that I might explore later. My main limitation so far is the lack of a PS/2 connector for the mouse, and I don’t have a serial mouse; however, I found an old InPort mouse with a serial adapter on eBay to serve that purpose, as having a mouse would be nice.

One thing I was struck by, much as with the iMac G3 I wrote about previously, is that despite being quite old, it still performs rather well with things like Office 97. It basically proves my theory that if you fit your software choices to the hardware, old hardware is still quite capable. I could write documents in Word or create spreadsheets in Excel without too much bother and without really missing anything available on a newer system, and the system works well with most older MS-DOS games. Older titles are facilitated by the Turbo Switch, which oddly doesn’t actually do anything via the button itself; instead, Control-Alt-Minus and Control-Alt-Plus change the speed, and the turbo switch light changes accordingly. (It toggles between 133MHz and 25MHz, making the latter roughly equivalent to a fast 386.)

I might even experiment with connecting it to my network; perhaps I’ll even try to get Win95 working with shared directories from Windows 10, which would be rather funny. (Though I suspect I might need to open up security holes like SMBv1 to get that working…)

Posted By: BC_Programming
Last Edit: 30 Oct 2017 @ 06:52 AM

 07 Aug 2017 @ 6:24 PM 

A while ago, it came out that Microsoft Paint would be deprecated going forward on Windows 10, replaced instead with Paint 3D. There have been loads of articles, forum threads, and general griping about this across the Internet. Nonetheless, Paint is hardly the first “casualty” of Windows as it has moved forward; nor is its loss, realistically, a big one.

A History

“Paint” has existed in some form or another dating back to the original Windows release. Like many parts of Windows, it was based on an existing product, but stripped down; in this case, Windows Paintbrush was effectively PC Paintbrush 1.05 for Windows, stripped down so as to not compete with the full product.

Windows 1.04

Paint on Windows 1.04

Aside from a smaller set of tools, it appears that another limitation of the included program is that it can only work with monochrome bitmaps. That isn’t a surprising limitation for the time period, though; the Apple Macintosh’s MacDraw program had a similar color limitation.

Windows /286

PAINT running on Windows /286

Windows /286 didn’t change the included PAINT program very much; I wasn’t able to find any significant differences myself, at least, and it seems to have the same limitations. I wasn’t able to get Windows /386 working; however, I presume PAINT is the same program between them, given that the major difference is enhancements for the 386.

Windows 3.0

Paintbrush running on Windows 3.0

It was with Windows 3.0 that PBRUSH was effectively created. While still seemingly based largely on PC Paintbrush, the Windows 3.0 version changed the program title from “PAINT” to “Windows Paintbrush”, changed the executable name, and redesigned part of the user interface. Interestingly, this interface is more similar to the more complete PC Paintbrush product provided with Windows /286, though of course it did not provide the full toolset of the commercial product either.

Windows 3.1

Paintbrush on Windows 3.1

PBRUSH didn’t see any significant changes from Windows 3.0. It still had a number of annoying limitations that plagued previous releases; in particular, tools couldn’t work with data outside the visible canvas. This meant you couldn’t even paste a screenshot into the program; it would be cropped. You can see this below: this is after performing a flood fill on the outer area of the image above and then scrolling down; the newly exposed canvas was not affected by the operation.

Win 3.1 Paint floodfill failure

Windows 95

MSPaint on Windows 95

Windows 95 saw PBRUSH deprecated in favour of MSPAINT; not just deprecated, mind you, but altogether removed. However, you could still invoke PBRUSH, thanks to a new “App Paths” feature of Windows. This capability exists to this day: as with Win95, there is no PBRUSH.EXE in Windows 10, but running PBRUSH will start MSPaint, as it has since Windows 95. The new Windows 95 version of Paint is “Microsoft Paint” rather than “Windows Paintbrush”, and it sports a new executable as well. It also redesigns the interface to adhere to the new “3D” style that Windows 95 introduced, and makes use of other Windows features that had been enhanced. For example, while you could edit colors in the older Windows Paintbrush, that program used a set of three sliders for the customization; Windows 95 added a new Custom Color dialog, which Microsoft Paint used for customizing palette entries. Thanks to how that dialog worked, you could save several custom colors outside of the normal palette and swap between them, too. It also adds a status bar, which was coming into its own as a convention with Windows 95; this included “tip” text appearing on the left as well as other information appearing in additional panes.
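The “App Paths” mechanism is simple enough to poke at directly: when you run a bare command name like PBRUSH, Windows consults a per-executable registry key and launches whatever that key points at. A rough sketch (assuming the documented App Paths location; the pbrush.exe entry is the one described above, and winreg is Windows-only, so the lookup is guarded):

```python
# Sketch: resolve a bare command name through the App Paths registry key.
import sys

APP_PATHS = r"SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths"

def app_paths_subkey(exe_name):
    """Registry subkey Windows checks for a given command name."""
    return APP_PATHS + "\\" + exe_name

if sys.platform == "win32":
    import winreg
    # On Windows 10 this resolves pbrush.exe to mspaint's full path.
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                        app_paths_subkey("pbrush.exe")) as key:
        target, _ = winreg.QueryValueEx(key, "")  # (Default) value
        print(target)
```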

Windows 98

MSPaint on Windows 98SE

Windows 98’s release of Microsoft Paint seems to have removed the ability to load and save custom colour palettes. It also dropped the ability to save to the .PCX format, while gaining the ability to use certain installed image filters, allowing it to save to .PNG, for example, if certain other software is installed.

Windows ME

MSPaint on Windows ME

The Windows ME version of MSPaint appears to be identical to the Windows 98SE version; however, the executables are not identical. I’m not sure what the difference might be beyond the header indicating a different Windows version, though. It’s here for completeness.

Windows 2000

MSPaint on Windows 2000

Another entry for completeness as, like Windows ME, I cannot find any differences between it and the Windows 98SE release of MSPaint.

Windows XP

MSPaint on Windows XP

Windows XP introduced a few major revisions to MSPaint. First, it could acquire images from a scanner or any TWAIN device (such as a digital camera). Moreover, it now had native support for the JPEG, GIF, TIFF, and PNG file formats, without any additional software installs.

Windows Vista

MSPaint running on Windows Vista

The Windows Vista release of Paint changes the default colour palette, has a set of new tool icons, and reorganizes some of the UI (the colour palette is moved, for example). It deepens the undo stack to 10 levels rather than 3, and saves to JPEG by default, which suggests it was largely intended or expected to be used for acquiring and saving photos.

Windows 7

MSPaint as included in Windows 7.

Windows 7 is another major overhaul of the program, on the same level as the change from Paintbrush in Windows 3.1 to MSPaint in Windows 95. It redesigns the interface around the “Ribbon” concept and adds a number of capabilities, brushes, and a few tools. It also now has anti-aliasing.

Windows 8

This version is pretty much identical to the Windows 7 release; though there are some minor adjustments to the Ribbon.

Future

Microsoft Paint is now deprecated, but this doesn’t prevent you from using it; even once it is removed from the default installation, it will still be available as a free download from the Store. You can also copy a version of Paint from a previous Windows 10 install to avoid dealing with an appx container file or any tracking that comes with using the Windows Store, if desired. I think the fuss over this change is a bit of an overreaction. There are plenty of other free programs that can accomplish the same tasks, and while it is a bit annoying to have to download them, Windows will still include Paint 3D, which should be capable of the same standard tasks people want the older Paint program for, such as screenshots.

The old PBRUSH application running on Windows 10. It’s a Miracle.

What is this witchcraft? Windows NT 3.51 was 32-bit but was based around Windows 3.1, so it got a 32-bit version of the same old PBRUSH program from Windows 3.1. That executable can be copied from an NT 3.51 install and run directly on Windows 10. Pretty interesting, though of arguably limited usefulness beyond putting it at the end of blog posts to pad out the length for no reason.

Posted By: BC_Programming
Last Edit: 07 Aug 2017 @ 06:24 PM

 05 Apr 2017 @ 11:31 AM 

For a while now, Windows 10 has had a “Game Mode” feature. I’m rather mixed on the feature myself; overall, I find it strange.

I’ve never been a fan of the “Game Booster” genre of software; it largely seems like snake oil, and where it does have an effect, it is really just the result of the same sorts of adjustments that can be made manually via services or other configuration options. Game Mode does have advantages here: it sort of puts those applications “out of business”, and, being built into the OS, it is a much safer implementation with less extreme goals. On the other hand, it does sort of legitimize the concept, which I’ve always found crazy, that such applications are in any way worth using.

I tend not to use the feature; however, I can see it having benefits for some users and systems. To me, overlay features such as the Game Bar used in this case feel like a sort of “chaff”. It is better than older approaches like the “Games for Windows Live” featureset, and better implemented as well, but I’ve found that, at least for now, it’s not really for me. This may be partly because I’m not a particularly heavy gamer; I seldom play games on my PC, nowhere near as much as I expected.

I also tend to enjoy older titles. Interestingly, I’ve found that many older games, even going back to Win98-era titles, run surprisingly well on Windows 10. Most issues I’ve encountered with older titles result either from the lack of 16-bit compatibility (with much older titles) or from the hardware being far in excess of what the game ever expected; a lot of older titles don’t support 2560×1440, for example, because it is such a high resolution, requiring minor patches. Windows 10 is surprisingly backwards compatible in this regard, even better than previous post-Vista Windows releases, including Windows 7, which had an interesting Explorer palette-realization issue that tended to cause problems with games using 256-colour modes.

Posted By: BC_Programming
Last Edit: 05 Apr 2017 @ 11:31 AM

Categories: Games, Windows
 26 Mar 2017 @ 4:12 PM 

I’m writing this, right now, on a computer from 1999: a Rev. C iMac G3 that I got off Craigslist for $20. The system has 64MB of memory and a 333MHz processor, and I’m using Microsoft Word 2001 running under Mac OS 9.2.2.

Considering the advances we’ve seen in tech, you would expect this system to be entirely unusable; and yet here I am, using a relatively full-featured productivity application with seemingly the same responsiveness and capability as on a more modern system.

This leads inexorably to a discussion of bloat. As computers grow faster, the software that runs on them expands to use the additional capability. I’ve had modern text editors type out what I wrote in slow motion, updating a character every second or so, on a 4GHz quad-core system with 32GB of memory that wasn’t otherwise being taxed. There is very little excuse for this, and yet it happens.

As computers moved forward, we find that the extra capability is oftentimes absorbed by developers. That is, a faster processor means they can get the same speed using C instead of assembly, or they can write the software in a higher-level language like Java or C# instead of C. That is entirely reasonable, as in a way it eventually reduces the cost of software to the consumer. Nowadays the question concerns web applications. We have many pieces of software written in JavaScript, for example, which put a heavy load on the interpreter under which they run. As an interpreted language, its performance is reduced even further, but this is considered acceptable because faster systems are the norm and a typical system is capable of running these applications at speed.

But in many respects, we’ve been upgrading our hardware while running in place. While many aspects have certainly improved (entertainment and game software, for example, has higher resolutions, more colours, higher-res textures, and more polygons than ever before), a lot of otherwise basic tasks have not greatly improved.

At the same time, one is often left wondering exactly what features we have gained in this inexorable forward march. As I type this, the system I’m writing on is not connected to the Internet; I’m not receiving any notifications or tips, and I’m not seeing advertisements for cloud storage or nag screens about installing the latest update to the system software. I can install applications and use them, and in many cases with much of the same effectiveness and even performance as the corresponding modern software, accounting, of course, for the exception of web browsers.

Web browsers are an interesting case for looking at how computing has changed. Your typical system from the late nineties would have had perhaps 64MB of RAM, like the iMac G3 I’m using right now. I can run Internet Explorer and open local web pages (I’m too lazy to move the system to connect it via Ethernet, since it naturally has no wireless capabilities of its own), and Internet Explorer consumes only 10MB of memory. On my main desktop system the proportions are similar; oftentimes I find Firefox or Chrome consuming upwards of 1GB of memory! It is easy to blame this on software bloat: to say browsers have merely started using more memory because they no longer need to conserve it. Obviously a web browser using upwards of 1GB of memory couldn’t have existed at all in 1999, but it runs without issue on most modern systems, particularly now that 4GB is becoming a “bare minimum”. Blaming it all on bloat would be an oversimplification, though, as the task of browsing has ballooned since the period when it could be done with that much RAM. Browsers now need to support not only more complicated HTML structures and features such as stylesheets; they are effectively becoming a platform of their own, with web applications running inside the browser. To put it in terms relative to the older Mac systems I’ve been fiddling with lately, it’s like a HyperCard stack. As a result, saying it is entirely due to bloat would certainly be unfair.

Perhaps, then, we need a better benchmark for comparison. I’m writing this in Microsoft Word 2001 for Mac, so perhaps Microsoft Word is a better comparison. As I write this, Word 2001 is using 10MB of memory. Launching Microsoft Word 2013 and opening a blank document, I find that, to begin with, it is using 55MB of memory.

Now, compared to the total amount of memory on each system, Word 2013 is actually using a much smaller percentage: 10MB is about 15% of the total memory on this Mac, but 55MB is only about 0.34% of the 16GB on the laptop I started Word 2013 on. In that sense, I suppose we could argue that the memory usage of applications has shrunk relative to the available hardware. In absolute terms, though, the story is different: a blank document uses over five times as much memory as it takes this older release, on an older computer, to maintain and display a multi-page document.
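The comparison above amounts to some quick arithmetic (taking 16GB as 16,384MB): Word 2001’s footprint relative to the iMac’s 64MB, versus Word 2013’s relative to the laptop’s total RAM.

```python
# Memory footprint as a share of total physical RAM, for both systems.
def pct_of_ram(used_mb, total_mb):
    """Return memory usage as a percentage of total RAM."""
    return 100.0 * used_mb / total_mb

imac_share = pct_of_ram(10, 64)           # ~15.6% of the G3's 64MB
laptop_share = pct_of_ram(55, 16 * 1024)  # ~0.34% of the laptop's 16GB
absolute_ratio = 55 / 10                  # 5.5x the absolute usage
```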

There are a number of reasons for this sort of difference. For one thing, excessive memory usage by certain components might not come up in testing on more recent machines; as long as the software runs, excess memory usage might not be detected, and even if 55MB is higher than on this older system, the smaller share of total physical memory on almost any modern system, as established, means it will not be considered an issue. Another reason is that additional capabilities sometimes bring added effects: features like Aero Glass, and the drawing of elements like the modern Office Ribbon, for example. Also to be considered are modern features like font smoothing, which were less prevalent and less advanced in 1999.

Nonetheless, it is still somewhat humourous that a basic word processor has managed to use that much more memory for what is effectively the same task! The actual word-processing capabilities are largely equivalent between the two releases, which is not something we can say of browsers.

Perhaps it is not too much of a problem. In many respects, application needs eventually dictate what people consider a “bare minimum” of RAM; meanwhile, many productivity tasks have remained largely the same, with similar feature sets and capabilities, even as those requirements rise. Early versions of Microsoft Word or Excel, for example, contain the bulk of the features that people actually use in the latest releases, while using a relatively infinitesimal amount of system memory to do so. This leads to what I find to be cringeworthy proclamations such as “How can you possibly do anything with only 2GB of memory?”, which make sense in a certain context but are pretty silly when applied broadly; we managed to do many of the same things we do nowadays with desktop computers 20 or 30 years ago, with far less memory and processing power, after all. One could easily imagine bringing somebody forward in time from, say, 1994 to hear such a statement, and watching them marvel at how such an unimaginably large amount of memory, an amount unheard of to them even for hard disk sizes, was being dismissed as far too little RAM for many of the same sorts of tasks they had performed without issue.

Personally, I’ve found popping onto an old computer like this iMac G3 to be a helpful experience. These are computers that, many years ago, were top of the line; the sort of systems some people would drool over. Now they are relegated to $20 Craigslist ads, which are the only thing between them and the dump. Meanwhile, the operating systems are not only responsive but designed in such a way that they are quite easy to use and even, dare I say it, fun! Mac OS 9.2.2 has little audio flairs, clicks, pops, and swoosh sounds associated with UI interactions that actually had me missing it when using my more recent systems. Which is not to suggest it wouldn’t become annoying fairly quickly with regular use.

Unfortunately, or I suppose in some ways fortunately, these systems are relics of a bygone era. Computers are commonplace enough that we have them in a form we can keep in our pocket. We have become so accustomed to the devices that they are now a part of daily life, as are the networked components, perhaps even more so. People are constantly updating their Facebook feeds, checking other people’s posts, reading Twitter or Instagram, sending text messages, arguing about politics with some brother of a friend of a friend of a friend on Facebook whom they’ve never met, and so on. We are living in what people in the 90s could effectively have described as a “fantasy world”, and yet here we are living it every day, to the point where it is not only mundane, but where things considered unimaginable conveniences only a decade ago are now unacceptable.

This is not intended to suggest we should not strive for, or even demand, progress; just that maybe we should lay off the hyperbole in describing how the lack of such progress is detrimental to our lives. A web portal missing a minor convenience feature is not something to throw a fuss over; software being released at a price point you disagree with is not a reason to go on the warpath against its developer; and just because you have a Tumblr blog with 5 readers doesn’t make you a social media influencer that developers, or anybody for that matter, should cater to in any way.

There is an argument that something ineffable has been lost with the rise and ubiquity of the Internet. While nowadays “research” means loading up Google in a new tab and running a few searches, it used to mean going to the library and looking through index cards and reference material. Where dealing with, say, a programming conundrum, or trying to use an unfamiliar program feature, once meant looking it up in the hard-copy manual or, in the former case, actually working out your own solution based on what you needed, now you just Google it and go directly to the answer on sites like Stack Overflow; you copy-paste the code and use it, or you mindlessly follow the steps outlined to use the program feature you want. Neither way is inherently better than the other; it’s just that the Internet really is the ultimate enabler.

I have about a half dozen books that I’ve barely even cracked open which, had I owned them a decade ago, I would have read cover to cover several times over by now. I’ve had project ideas squashed before I even started them by a quick Google search revealing that programs already existed which performed the function better than I had even imagined. Before, not knowing anything of the sort existed, I would have pursued such projects anyway and ended up learning as a result.

As much as the ubiquity of the Internet has helped us, it has also acted as an ever-present enabler of our addictions. It feeds our addiction to information, our addiction to instant gratification, and our ever-present curiosity, but it does so with what could almost be described as empty calories; it’s like eating hard candies when you are hungry. So it leaves many unsatisfied and seeking more, wondering what is missing.

It was the hunt for the information, the trail you blazed while doing research or cross-referencing a Dewey Decimal index. It was the excitement of finding a nice thick book on a subject you were interested in, knowing it would keep your information “mouth” fed for weeks to come as you read and reread it, soaking in all the information it had to offer, even the bits you couldn’t use. I read “Applied Structured BASIC” cover to cover multiple times despite it covering ancient BASIC dialects that had long since stopped mattering.

Now, I find there is a place for the phrase “information overload”. No bookshelf, no matter how full, can possibly compete with the World Wide Web in the ease of access to its information, the accuracy (and inaccuracy) of that information, or the sheer amount of it, to the point where one could almost argue there is too much. Perhaps the skill in using the Internet for information is having a proper “tunnel vision”: getting the information you want and getting out when you are looking for something specific. The alternative, of course, is to go looking up how to create a Jump List in Windows 7 and suddenly find yourself reading about how strawberries are harvested.

Posted By: BC_Programming
Last Edit: 26 Mar 2017 @ 04:12 PM

Categories: General Computing
 19 Mar 2017 @ 4:42 PM 

With older CRT screens, “burn-in” occurred when the phosphor coating was effectively “burned off” by the scanning beam. There were many examples of older computer displays permanently displaying the WordStar menu or the Lotus 1-2-3 menu, for example, because people would leave them open for so long.

This problem was “solved” by LCD displays. But as it happens, a similar symptom can occur with IPS displays; it’s just not typically as permanent.

A few years ago I got a 2560×1440 IPS monitor to replace the 1440×900 TN panel I had been using. It has served me well; however, over time I noticed a “darkening” near the top of the screen. It never matched up with the images I would display for extended periods (my browser, or Visual Studio), so I figured it was a minor defect. It wasn’t particularly noticeable, either.

More recently, though, I was able to clearly make out many Firefox icons in precisely the locations they occupy when the Firefox window is maximized on that screen. This is because, on a few occasions, I decided not to turn that screen off overnight. The image apparently persists because of electrical changes in the actual cells, which effectively let less backlight through at the same voltage level.

I’ve already ordered a new monitor, but I’ve had some success at least diminishing the afterimage by leaving a full white screen up overnight a few times (just a quick Windows Forms program). I’ve also taken to not maximizing Firefox, instead expanding it manually to fill the screen but shrunk downward a bit, so the same elements aren’t in the location that was “persisted”.
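For reference, the sort of quick Windows Forms program I mean can be sketched in a few lines. This is a minimal stand-in, not the exact program I used:

```csharp
using System;
using System.Drawing;
using System.Windows.Forms;

// Fills the screen with solid white to help fade LCD image persistence.
// Press Escape to close; the cursor is hidden so it doesn't linger on screen.
class WhiteScreen : Form
{
    WhiteScreen()
    {
        BackColor = Color.White;
        FormBorderStyle = FormBorderStyle.None;
        WindowState = FormWindowState.Maximized;
        TopMost = true;
        KeyDown += (s, e) => { if (e.KeyCode == Keys.Escape) Close(); };
    }

    protected override void OnShown(EventArgs e)
    {
        base.OnShown(e);
        Cursor.Hide(); // don't leave a persistent cursor-shaped afterimage instead
    }

    [STAThread]
    static void Main() => Application.Run(new WhiteScreen());
}
```

Leaving this up overnight is the same idea as the “full white screen” fix mentioned above; some suggest a full black screen instead, so the form’s BackColor is trivially swappable.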

It’s interesting that the afflictions of yesterday’s technology that were purported to have been solved have, at least in some form, remained with us. While the effect allegedly can be fixed (with either a full white or full black screen, depending who you ask), it is certainly something worth avoiding to begin with. I certainly found the behaviour unexpected, hence this quick little post on the topic.

Posted By: BC_Programming
Last Edit: 19 Mar 2017 @ 04:42 PM

 17 Mar 2017 @ 9:59 PM 

There has been some concern, and even condemnation, regarding Microsoft’s decision not to provide updates to Windows 7, 8, and 8.1 when run on hardware using a newer processor, such as the Intel Kaby Lake chips. Some have claimed this is a marketing move to try to “force” users onto Windows 10.

Now, I’m not the greatest fan of some of the things introduced with Windows 10. At the same time, I have no modern systems- other than virtual machines- running anything but Linux or Windows 10. So it’s more an annoyance at how much one has to do to appropriately assert one’s desired options in Windows 10.

Windows 7 and 8/8.1 continue to be supported as per the Windows lifecycle; the change applies to hardware that was introduced after the end of mainstream support for both operating systems. Extended support only covers security updates; however, providing security updates for Windows 7 and 8/8.1 on those processors would mean supporting the processors. The issue is that while the newer chips likely run the same code the same way older chips did, there is no guarantee of that, and the software would still have to be tested and bugfixed specifically for the newer chips- which means, effectively, supporting the new processors.

The updates cannot go out on an “as is” basis to systems with the new processors, because then any problems would incur support costs and bugfixes to those updates- which would also effectively mean supporting the new processors on the older software.

Worth noting is that this doesn’t lock out enterprising users who are willing to take the risk that their entire Windows 7/8/8.1 system will stop functioning due to said updates. One can still work around the block; it just requires stepping even further off the beaten path, which makes it much clearer- and far “safer”- for Microsoft to tell you to basically piss off if you ask for support.

It’s likely this approach was adopted to try to prevent a repeat of the Windows XP diehards. Mind you, it hasn’t worked so far; many people are now Windows 7 diehards in much the same capacity. But at least- from Microsoft’s perspective- they won’t be financing it.

Posted By: BC_Programming
Last Edit: 17 Mar 2017 @ 09:59 PM

 24 Nov 2016 @ 10:03 PM 

There is a seemingly common affliction among some users of Windows whereby their desktop icons receive old-style dotted focus rectangles. This seems to affect Windows Vista and later.

Dotted Focus Rectangle.

After some investigation, I found the cause to be an accessibility setting. Inside Ease of Access in the Control Panel, there is a “Change how the keyboard works” option. This takes you to another page with an “Underline keyboard shortcuts and access keys” checkbox. When this option is checked, keyboard cues are enabled. This includes the underlined text of menus and buttons- but it also includes ListView focus rectangles, which means that with the option enabled, a focus rectangle is shown on the desktop rather frequently.

To change this setting, toggle it and reboot.
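For the curious, the state of this option can also be checked programmatically: keyboard cues are exposed through SystemParametersInfo with the SPI_GETKEYBOARDCUES action. A quick C# sketch (the class name is mine):

```csharp
using System;
using System.Runtime.InteropServices;

class KeyboardCuesCheck
{
    // Documented in WinUser.h: retrieves whether keyboard cues (underlined
    // access keys and focus indicators) are always shown (TRUE) or only
    // shown after keyboard navigation (FALSE).
    const uint SPI_GETKEYBOARDCUES = 0x100A;

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool SystemParametersInfo(uint uiAction, uint uiParam,
        ref bool pvParam, uint fWinIni);

    static void Main()
    {
        bool cuesEnabled = false;
        if (SystemParametersInfo(SPI_GETKEYBOARDCUES, 0, ref cuesEnabled, 0))
            Console.WriteLine("Keyboard cues enabled: " + cuesEnabled);
        else
            Console.WriteLine("SystemParametersInfo failed: " + Marshal.GetLastWin32Error());
    }
}
```

If this reports true and you have unexplained dotted rectangles on your desktop icons, the Ease of Access checkbox described above is the likely culprit.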

Posted By: BC_Programming
Last Edit: 24 Nov 2016 @ 10:03 PM

 09 Nov 2016 @ 8:30 AM 

I’ve previously written about adjusting the Windows master volume control programmatically. I alluded to possible additional features, such as being able to view the volume levels of other applications. I’ve gone ahead and made those changes.

The first thing to reiterate is that this makes use of a low-level .NET Wrapper for the Windows Core Audio API. This can be found here.

To start, I defined an object to represent a single application’s volume session info/properties. In addition, it is given a reference to the IAudioSessionControl interface representing that application’s audio session, so the session can be manipulated directly by adjusting the properties of the class.

Next, we need to declare a COM import for the Multimedia Device Enumerator. Specifically, we need to import the coclass, as the Vannatech library only provides interfaces, which we cannot instantiate:
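The original listing isn’t preserved on this page; a minimal equivalent declaration, assuming the standard MMDeviceEnumerator CLSID from mmdeviceapi.h (the class name here is my own choice), looks like this:

```csharp
using System.Runtime.InteropServices;

// The MMDeviceEnumerator coclass, identified by its CLSID from mmdeviceapi.h.
// Declaring it as a [ComImport] class lets us instantiate it with "new";
// the wrapper's interfaces alone cannot be instantiated.
[ComImport]
[Guid("BCDE0395-E52F-467C-8E3D-C4579291692E")]
internal class MMDeviceEnumeratorComObject
{
}
```

Casting a new instance of this class to the wrapper’s IMMDeviceEnumerator interface performs the QueryInterface that the enumeration code builds on.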

Now that we have a starting point, we can create an enumerator method that retrieves all active audio sessions as “ApplicationVolumeInformation” instances:

A GitHub repository with a more… complete… implementation of a working console program can be found here.

Posted By: BC_Programming
Last Edit: 11 Nov 2016 @ 12:29 PM

Categories: .NET, C#, Programming, Windows
 28 Aug 2016 @ 1:08 AM 

As anybody knows, there can be a lot of incorrect information on the internet. Internet “just so” stories can spread like wildfire if they are believable and explain something neatly. One of those stories involves older game consoles and computers: over time, the once-white and gray plastics of old systems like the Apple II, the NES, and the SNES change colour- from white or gray to yellow- and over time that yellow darkens, sometimes even turning brown.

This phenomenon is “explained” here. Or is it? Does what is stated there about the process reflect reality? Does it make chemical sense? To the layman or casual observer, it makes sense- bromine IS brown, after all, and it’s added to the plastic. But is there a chemical basis and support for it? What reactions actually take place?

“RetroBright”- which is basically just hydrogen peroxide- is commonly recommended to “reverse” the effects. The reason I care about the actual chemistry is that the yellowing going away isn’t, by itself, an indication that everything is back to how it was; colour changes can result from all sorts of things. More importantly, if we learn the actual chemical processes involved, perhaps we can come up with alternative approaches.

Basically, the story put forth in the article is a rather commonly repeated myth- a chemical “just-so” story of sorts. “Bromine is brown, so that must be it” is the extent of the intellectual discussion regarding the chemistry, more or less. Generally there isn’t much drive to look further into it- it all makes sense on the surface to the layman, or even to someone with fairly standard chemistry knowledge. But when you look deeper, you see that the commonly held belief that brominated flame retardants are responsible doesn’t hold up.

First we can start with the first inaccuracy in that link: bromine is not added as a flame retardant. That is flat-out, categorically and completely wrong, and trivially easy to refute. Bromine compounds- specifically chemicals like tetrabromobisphenol A (C15H12Br4O2)- are added as flame retardants. But as they are compounds, the colour of elemental bromine (brown) is irrelevant, because elemental bromine is not added to the plastic.

The article also says that “The problem is that bromine undergoes a reaction when exposed to ultraviolet (UV) radiation”. But bromine doesn’t photo-oxidize; it doesn’t even react with anything in the air on its own. Creating bromine dioxide involves either exposing bromine to ozone at very low temperatures alongside trichlorofluoromethane, or passing a current through gaseous bromine mixed with oxygen. Neither of these seems likely to take place inside a Super Nintendo. Not to mention that elemental bromine is brown- so if it were in the plastic, oxidation would change it from the brown of elemental bromine to the yellow of bromine dioxide.

Back to what IS in the plastic, though: tetrabromobisphenol A is not photosensitive. It won’t react with oxygen in the air due to UV light exposure, and the bromine cannot be “freed” from the compound and made elemental through some coincidence in a typical environment. It is simply not the cause of the yellowing (ABS will yellow without BFRs as well, which rather indicates they probably aren’t involved).

The yellowing is inherent to ABS plastics, because it is the ABS plastic itself that is photo-oxidative. On exposure to UV light (or heat, which is why it can happen to systems stored in attics, for example), the butadiene portion of the polymer chain reacts with oxygen and forms carbonyl-b. That compound is brown. There’s your culprit right there. RetroBright works because those carbonyls react with hydrogen peroxide, creating another compound which is colourless- but the butadiene portion of the polymer remains weakened. Oxalic acid is thought to be one possible way to reverse the original reaction.

So why does it sometimes not affect certain parts of the plastic, or certain systems? Here the “just so” story is a bit closer to reality. The story goes that different plastic formulae have different amounts of brominated flame retardants. This is probably true, but as that compound isn’t photo-reactive or involved in the chemical process, it’s not what matters here. What causes the difference is a variance in a different part of the formula: the UV stabiliser.

UV stabilisers are added to pretty much all ABS plastic intentionally, to offset the butadiene reaction and the yellowing effect of the resulting carbonyls. They absorb UV light and dissipate it as infrared-wavelength energy, which doesn’t catalyze a reaction in the butadiene. Less UV stabiliser means more UV reaches the butadiene and the plastic yellows more quickly; more UV stabiliser means less UV catalyzes reactions and the plastic takes longer to change colour.

As with anything related to this, the best way to find out is to experiment. I’ve decided to pick up some supplies and test both approaches: a standard “retrobright” mixture using hydrogen peroxide, and a variation using oxalic acid. I can apply both to the same piece of yellowed plastic and observe the results. Are both effective at removing the yellow colour? What happens longer term? It should be an interesting experiment.

Posted By: BC_Programming
Last Edit: 17 Sep 2017 @ 05:00 PM

