02 Feb 2018 @ 5:50 PM 

The way we refer to CPU architectures sits in a sort of middle ground: the terms need to be technically accurate, but over time the terminology can drift. I thought it would be interesting to look into the name origins of the two most widely used CPU architectures for desktop systems today. Admittedly, this is fresh in my mind from some research I was doing.

x86

Nowadays, most 32-bit software for desktops and laptops is referred to as being built for “x86”. What does this mean, exactly? Well, as expected, we have to go back quite a ways. After the 8086, Intel released the 80186, 80286, and 80386. The common architecture and instructions behind these CPUs understandably came to be known as “80x86 instructions”. The 486 that followed the 80386 officially dropped the 80 from the name; inspection tools would imply its existence, but Intel never truly called its 486 CPUs “80486”. It’s possible this is how the 80 got dropped. Another theory is that it was simply dropped for convenience; “x86” was enough to identify what was being referenced, after all.

The term survives to this day, even though, starting with the Pentium, the processors themselves never truly bore the “mark” of an x86 processor.

x64

x64 is slightly more interesting in its origins. 64-bit computing had existed on other architectures before, but “x64” now refers to the typical x86-compatible 64-bit operating mode. Intel’s first foray into this field was the Itanium processor, a 64-bit processor whose instruction set is called “IA-64” (as in “Intel Architecture 64”). This did not work out well, as IA-64 was not directly compatible with x86 and therefore required software emulation.

It was AMD who extended the existing x86 instruction set to add 64-bit support through a new operating mode. Much as 32-bit instructions were added to the 80386 with compatibility preserved by adding a new operating mode to the CPU, the same was done here: 64-bit operations are exclusive to the new 64-bit “long mode”, 32-bit software still runs in 32-bit protected mode, and the CPU remains compatible with real mode.

This AMD architecture was called “AMD64” and the underlying architecture that it implemented was “x86-64”.

Intel, as part of a series of settlements, licensed AMD’s new architecture and implemented x86-64. This implementation went through a few names (IA-32e, EM64T) but Intel eventually settled on Intel 64. Intel 64 and AMD64 aren’t identical, so software targets a common subset; this subset is where we get the name x64.

Posted By: BC_Programming
Last Edit: 02 Feb 2018 @ 05:50 PM

Categories: Hardware

 30 Jan 2018 @ 9:02 PM 

Software and computer security has always been a rather important topic. As our systems become more interdependent and connected, and we expose ourselves and our important information more and more, it is becoming even more important. Operating system and software updates are issued to address security problems, and these are given the utmost importance as users are urged to install them as soon as possible. Many operating systems, such as Windows 10, disable or restrict the ability to prevent updates (it seems to require the Pro edition to restrict updates to user-initiated installs, for example). Many consider this a positive change, the idea being that it will prevent systems from being compromised through those security exploits.

And, certainly, that is true. Installing security patches will, obviously, prevent the exploits that they resolve from being used for malicious purposes. However, I think the impact that those exploits have on your typical end user has been overstated.

Based on my own experiences, I estimate that the vast majority of end-user malware infections are not caused or contributed to in any notable way by the sort of issues resolved by security updates. Those updates are more applicable to servers, data centers, and corporate environments. On end-user PCs, it is rare to find a malware infection that was not caused in some way by trojan-horse malware: something the user explicitly downloaded and ran themselves which had the unintended side effect of releasing malware onto their system. Pirated software, game mods, “keygens”, and so on. Screensavers, greeting-card executables, applications disguised as images. Something as seemingly innocuous as an aftermarket Windows theme could very easily contain an unwanted payload, and it won’t matter whether the system is fully up to date if you allow it to install.

The Security Circus

I call the general concept of overstating those concerns the “security circus”. It infects certain styles of thinking and makes itself a self-perpetuating concept over time. As an example scenario, a user may come to an IT repairperson with issues on their PC, and it may be determined that those issues are caused by malware. The “security circus” contribution to this scenario is that the repairperson discovers the system is out of date and missing a number of critical security updates. Because they have learned, over time, that security updates are critical and prevent infections, they may, and very often do, assume that the malware made its way onto the PC through those unpatched vulnerabilities. Over time these occurrences pile up, and that IT person can state, without any intention of lying, that they have seen plenty of systems compromised by vulnerabilities, even though, realistically, they don’t actually know whether the vulnerabilities were even responsible.

The “acts” of this security circus seem largely to revolve around Fear, Uncertainty, and Doubt being spread and manipulated. Coincidentally, I notice that these efforts often work in favour of a number of corporate interests. Forced OS updates, for example, benefit the OS manufacturer, particularly as updates may well deliver any number of other pieces of software which provide “diagnostic” information that the company can use for marketing efforts. Security updates benefit security firms and security software vendors, whose products are used to “prevent” the problems until that critical patch arrives to fix the issue, or who release security “scanners” which analyze and report whether a system is susceptible to the vulnerability.

Some recent security scares come to mind when I think about the “security circus”.

Wannacry

The Wannacry ransomware provides some good examples of the operation of this “security circus”. Articles and postings on the issue are often decidedly vague about the extent of the vulnerability involved, and often overstate its capability; users are urged to update as soon as possible, and in some cases I’ve seen it argued that the vulnerability allows the malware to be installed over the Internet.

The reality, however, is that Wannacry had a distribution method that could exploit a vulnerability in SMBv1 in order to spread to other systems accessible on the same LAN from an infected system. This means that on a network with vulnerable systems, those systems will spread the infection once one of them gets infected; however, that “patient zero” cannot be infected remotely. Wannacry would still only reach a “patient zero” LAN system through some other infection vector, and that infection vector was almost certainly trojan-horse malware of some description.

Which is not, of course, to understate that this is certainly a concern. If Little Jimmy runs an infected game-mod installer and his system gets infected, other vulnerable computers on the same network would eventually be compromised. However, I think the critical thing in that scenario is not the security updates but user education, so users avoid installing malicious software to begin with. Why did Little Jimmy trust the game-mod installer? Should Little Jimmy even have permission to install software? What sort of education can be provided so that users with “vulnerable” habits can adjust those habits and avoid problems? Installing security updates, security software, firewalls, etc. is, in my opinion, largely a way of avoiding that question, and unfortunately it pairs poorly with those habits: a user with “vulnerable” habits is often the sort who will happily disable their anti-virus when, say, a game-modification installer claims it is a “false positive”, or who will happily grant administrator permissions, if they can, to applications based on promised functionality.

Game “cheat” software often takes that approach: making a promise and then requesting elevation with that promise in mind is enough to convince some users to “take a chance”. These same “vulnerable” users are also susceptible to phishing scams and to software that steals login information for online accounts. A specific example would be simple applications which claim to “give OP” to a Minecraft player. All you need to do is give your username and password! But of course it does not give you OP; instead, it simply e-mails your login information to a specified account. It doesn’t work, the user deletes the program, but perhaps never thinks about changing their login information because, as far as they know, the program just “didn’t work”. Until one day they cannot log in. Or, for MMOs, they suddenly find their character poorly equipped, or banned because the account was used for some nefarious in-game activity.

Speaking for myself, aside from the occasional Malwarebytes scan, I don’t run any sort of background AV software or firewall. In fact, I disable the built-in Windows software that provides those features. To my knowledge, I’ve not been infected in over 10 years. And even then, what I was infected with, Virut/Sality, wasn’t being picked up by any security software, even fully updated. Since then I’ve had systems that lacked important security updates magically not be infected in the ways the aforementioned “security circus” would have me believe. It seems, at least from where I am standing, that the implications of security vulnerabilities for your typical end-user system are vastly overstated, and focusing on them as the way to prevent infections may be a focus in the wrong area. Instead, users should receive education so that their “vulnerable” habits can be eliminated, or at the very least so they are made aware of them. Better education about computer systems in general can help as well; knowing the difference between an svchost.exe where it should be and an svchost.exe where it isn’t can make it possible to identify unwanted software even when installed security software isn’t picking it up.
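As a toy illustration of that sort of check, here is a small Python sketch; the expected directories and the heuristic itself are my own assumptions for illustration, not anything from a real security product:

```python
import ntpath  # handles Windows-style backslash paths on any platform

# On a stock Windows install, the real svchost.exe lives in System32
# (or SysWOW64 for 32-bit processes). Anything else is suspect.
EXPECTED_DIRS = {r"c:\windows\system32", r"c:\windows\syswow64"}

def is_suspect_svchost(image_path):
    """Return True if a process image named svchost.exe runs from an
    unexpected directory - a cheap heuristic, not a real AV check."""
    directory, name = ntpath.split(image_path.lower())
    if name != "svchost.exe":
        return False  # not claiming to be svchost at all
    return directory not in EXPECTED_DIRS

print(is_suspect_svchost(r"C:\Windows\System32\svchost.exe"))      # False
print(is_suspect_svchost(r"C:\Users\Jimmy\Downloads\svchost.exe"))  # True
```

The point isn’t this particular check; it’s that knowing where system binaries belong lets you spot impostors no scanner has signatures for.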

Spectre/Meltdown

Another topic of some interest that has taken the security world by storm is the Meltdown and Spectre speculative-execution security problems found in many microprocessors (Meltdown being specific to Intel chips). These concerns relate to the ability to access memory that would otherwise be unavailable, by exploiting speculative execution and cache behaviour. Meltdown involves carefully crafted machine-language instructions which trick speculative execution into providing access to small pockets of memory that should not be accessible: kernel-mode pages that are part of the application’s virtual address space. Spectre functions similarly, using carefully crafted machine code to perform various memory operations and carefully measure their timing in order to guess at the data in certain areas of those kernel-mode pages within the process’s virtual address space.

I feel, again, that the security circus has somewhat overstated the dangers involved with these problems; in particular, it is too common to see statements that this could be exploited through web-based code such as Javascript, which itself would require escaping the Javascript sandbox, something with wider security implications anyway. It also seems to presume that this could be used to steal web or system passwords, when realistically it only enables viewing tiny pockets of driver-related process memory, and things like ASLR could very easily mitigate any directed attack looking for specific data.

But the reality hardly sells articles, and certainly doesn’t sell security software, which, I might add, by sheer coincidence tends to be either a sponsor or major advertiser for many of the more widely publicized security publications. Coincidence, no doubt.

Posted By: BC_Programming
Last Edit: 31 Jan 2018 @ 07:51 PM


 09 Dec 2017 @ 12:27 PM 

Winamp is a rather old program, and to some people it represents a bygone era: the late ’90s and early 2000s in particular. However, I’ve not found any “modern” software that compares. There is plenty of software (MediaMonkey, MusicBee, etc.) which attempts to mimic Winamp or provides the same general capability of managing a local music library, but they either don’t support Winamp plugins, don’t work properly with many such plugins, or, most importantly, don’t add anything.

Not adding anything is the important one here. At best, I’m getting the same experience as I do with Winamp, so I’m not gaining anything. People ask, “Why don’t you switch?” and the default answer is “Why should I?” If the only reason is that what I am currently using is “outdated” and no longer cool, then maybe I should stick with it, because we have something in common.

Typically, though, I’d be losing functionality. With Winamp I’ve got everything set up largely how I want. More importantly, my music library spans not only FLAC and MP3 files but also various video game music formats for various systems, with complete audio libraries for any number of game titles that I can pull up easily. These native formats are much smaller than the same tracks encoded as MP3 or FLAC, and because they are native formats they rely on Winamp plugins, which provide additional features for adjusting the audio. These plugins simply don’t exist or don’t work with modern software, so I’d have to relegate those video game music formats to specific, individual players if I were to switch to, say, MusicBee for my local music library.

Nowadays, even the concept of a local audio library is practically unheard of. People “listen to music” using streaming services or even just YouTube videos, typically on a smartphone, where storage space tends to be at a greater premium as well. I find that I detest playing music on my phone (a Nexus 6), simply because there is no good software for managing music saved to local storage, and it gets awful battery life used this way. This is why I use an older 16GB Sony Walkman MP3 player instead; the battery can probably manage a good 48 hours of continuous playback, and it is much more compact than the phone. Even if this means an extra piece of “equipment” when I go somewhere, I’m not wasting my phone’s battery life to play music.

Recently, I had the need to do something nearly as “outdated” as the program I elected to do it with: burning an audio CD. I’ve found this to be the easiest way to transfer music to my original Xbox console to create custom soundtracks (a feature which seems to be unique among consoles altogether). So I popped in a CD-RW, opened Winamp, clicked on the CD recorder… and got a BSOD: DPC_WATCHDOG_VIOLATION.

Well, that isn’t supposed to happen. After determining it was reproducible, I looked further into it. In particular, I found that within the Current Control Set, the CDROM device class for my hardware had a LowerFilters driver specified: PxHlpa64. So, I set about finding out what this was.

I found that PxHlpa64 is a driver by “Sonic Solutions” used by some CD-recording software. I couldn’t find any such software installed, so I merely renamed the affected entry and rebooted. The problem went away and everything was as it should be. (I subsequently wiped out the directory containing the driver file.) I suspect that I previously installed a program which used the driver, and the uninstall failed to remove it for any of a number of reasons.
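To illustrate the shape of that fix: the UpperFilters/LowerFilters values are REG_MULTI_SZ lists of driver names, so removing a stale entry amounts to filtering a list. This Python sketch models only the list manipulation, not the live registry access; the class-key path in the comment is my understanding of where these values live, not taken from the post:

```python
# On Windows, the CDROM device class key is (to my understanding)
# HKLM\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318},
# and its LowerFilters value is a REG_MULTI_SZ list of driver names.

def strip_filter(filters, stale):
    """Return the filter list with any case-insensitive match for the
    stale driver removed. An empty result means the value itself should
    be deleted rather than written back as an empty list."""
    return [f for f in filters if f.lower() != stale.lower()]

print(strip_filter(["PxHlpa64"], "PxHlpa64"))                 # []
print(strip_filter(["GEARAspiWDM", "PxHlpa64"], "PxHlpa64"))  # ['GEARAspiWDM']
```

On a real system you would read and rewrite the value with regedit or the winreg module; the point is that only the stale entry goes, since other filter drivers in the list may be load-bearing.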

One of the advantages of having a bit of an idea of what is going on with Windows (or any OS, really) is that you can more intelligently attempt to solve the unexpected problems you encounter. Since I was aware of issues involving optical drives and driver “filter” settings, I was able to find and fix the cause fairly quickly.

Posted By: BC_Programming
Last Edit: 09 Dec 2017 @ 12:27 PM


 25 Nov 2017 @ 7:11 PM 

The code for this post can be found in this github project.

Occasionally you may present an interface which allows the user to select a subset of specific items. You may have a setting which allows the user to configure, for example, a set of plugins, turning certain plugins or features on or off.

At the same time it may be desirable to present an abbreviated notation for those items. As an example, if you were presenting a selection from alphabetic characters, you may want to present them as a series of ranges; if you had A,B,C,D,E, and Q selected, for example, you may want to show it as “A-E,Q”.

The first step, then, would be to define a range. We can then take appropriate inputs, generate a list of ranges, and then convert that list of ranges into a string expression to provide the output we are looking for.

For flexibility we would like many aspects to be adjustable; in particular, it would be nice to be able to adjust the formatting of each range based on other information, so rather than hard-coding an appropriate ToString() routine, we’ll have it call a custom function.
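The post’s actual code is C# (in the linked GitHub project); as a rough sketch of the idea, here is what such a range type with a formatting callback might look like in Python. The names are illustrative, not from the project:

```python
from dataclasses import dataclass
from typing import Callable, Generic, Optional, TypeVar

T = TypeVar("T")

@dataclass
class Range(Generic[T]):
    """A run of consecutive selected items: a start, an end, and an
    optional formatting callback so callers control the text output."""
    start: T
    end: T
    formatter: Optional[Callable[["Range"], str]] = None

    def __str__(self):
        if self.formatter is not None:
            return self.formatter(self)
        # Default formatting: collapse single-item ranges to just the item.
        if self.start == self.end:
            return str(self.start)
        return f"{self.start}-{self.end}"

print(Range("A", "E"))  # A-E
print(Range("Q", "Q"))  # Q
```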

Pretty straightforward: a starting point, an ending point, and some string formatting. Now, one might wonder about the lack of an IComparable constraint on the type parameter. That would make sense for certain types of data being collated, but in some cases the “data” doesn’t have a type-specific succession.

Now, we need a routine that will return an enumeration of these ranges given a list of all the items and a list of the selected items. This, too, is relatively straightforward. Instead of a free-standing routine, this could also be encapsulated in a separate class with member variables to customize the formatted output. As with any programming problem, there are many ways to do things, and the trick is finding the right balance.
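As a sketch of what that routine could look like, in Python rather than the project’s C#, and using plain (start, end) tuples for brevity:

```python
def build_ranges(all_items, selected):
    """Walk the full ordering and emit (start, end) tuples for each
    consecutive run of selected items."""
    chosen = set(selected)
    ranges, start, prev = [], None, None
    for item in all_items:
        if item in chosen:
            if start is None:
                start = item              # open a new run
            prev = item
        elif start is not None:
            ranges.append((start, prev))  # close the current run
            start = None
    if start is not None:
        ranges.append((start, prev))      # final run extends to the end
    return ranges

print(build_ranges("ABCDEFGHIJKLMNOPQRSTUVWXYZ", "ABCDEQ"))
# [('A', 'E'), ('Q', 'Q')]
```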

Sometimes you might not have a full list of the items in question, but you might be able to indicate what an item is followed by. For this, I constructed a separate routine with a similar structure which instead uses a callback function to determine the item that follows another. For integer types we can just add 1, for example.
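A sketch of that successor-callback variant, again in illustrative Python rather than the project’s C#:

```python
def build_ranges_succ(selected, successor):
    """Group an ordered list of selected items into (start, end) runs,
    using a callback that returns the item following a given item,
    so no full item list is required."""
    ranges = []
    for item in selected:
        if ranges and successor(ranges[-1][1]) == item:
            ranges[-1] = (ranges[-1][0], item)  # item extends the last run
        else:
            ranges.append((item, item))         # item starts a new run
    return ranges

# For integer types the successor is just "add 1".
print(build_ranges_succ([1, 2, 3, 7, 8, 12], lambda n: n + 1))
# [(1, 3), (7, 8), (12, 12)]
```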

This is expanded in the github project I linked above, which also features a number of other helper routines for a few primitive types as well as example usage. In particular, the most useful “helper” is the routine that simply joins the results of these functions into a resulting string.
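A Python sketch of what that joining helper might look like; the actual C# helper in the project will differ:

```python
def format_ranges(ranges, fmt=None, separator=","):
    """Render (start, end) runs as one abbreviated string,
    e.g. [('A', 'E'), ('Q', 'Q')] becomes 'A-E,Q'."""
    if fmt is None:
        # Default: single-item runs print as the item, others as start-end.
        fmt = lambda s, e: str(s) if s == e else f"{s}-{e}"
    return separator.join(fmt(s, e) for s, e in ranges)

print(format_ranges([("A", "E"), ("Q", "Q")]))  # A-E,Q
```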

Posted By: BC_Programming
Last Edit: 25 Nov 2017 @ 07:11 PM

Categories: .NET, C#, Programming

 30 Oct 2017 @ 6:52 AM 

A couple of weeks ago, I thought it would be neat to get a computer similar to my first PC, which was a 286. I’d actually been considering the prospect for some time, but the prices on a DTK 286 (DTK was the brand I had) were a bit high. However, I stumbled on a rather cheap listing for a DTK 286 PC; it wasn’t identical to the one I had, but it was a similar model with a slightly reduced case size that seemed otherwise the same, so I snapped it up.

It arrived a little worse for wear from the journey: the front of the case, which was attached via plastic standoffs screwed into the metal case itself, had all of those plastic snaps come off. However, this shouldn’t be too much of a problem, as I’m sure I can get it to stay attached for presentation purposes.

When I opened it up to see if anything else had been damaged, I found the network card was out of its slot. So I pushed it in. Then I noticed the slot was PCI. 286 systems had 8-bit and 16-bit ISA, so already I knew something was up. The processor having a heatsink, and being in a Socket 7, meant this was clearly not a 286 system.

Instead, the system is a Pentium 133 (non-MMX) in a Socket 7, with 64MB of RAM, a 900MB hard drive, an ATI Mach 64, and 10/100 Ethernet. The floppy diskette drive wasn’t working correctly, so I swapped it for one of my other floppy drives. I also attached one of my CD-RW drives so I could burn data and install programs onto the Windows 95 install running on the system.


Now, arguably a claim could be made against the seller, but I think it was sold this way by accident. It seems to use a specialized industrial motherboard intended for these sorts of Baby AT cases; I don’t think a standard consumer case had Socket 7 alongside the large, older keyboard DIN connector. The motherboard is apparently quite uncommon, more so with Socket 7 rather than Socket 5. It also has a motherboard cache “card” installed, which doesn’t look particularly difficult to find but goes for about half what I paid for the entire unit. The board is unusual in that it seems to be missing things such as shrouds around the IDE connections, and it has no serial number where one is specified in the center of the board.

My original intent was to fiddle with MS-DOS and Windows 3.1, so realistically this Pentium system could work for that purpose; I have a few older IDE hard drives I could swap in to set up a dual-boot between MS-DOS/Windows 3.1 and Windows 95. The Mach64 is an older card but is well supported on Windows 95, Windows 3.1, and MS-DOS, so it seems like a good fit. It only has 1MB of RAM, so higher resolutions drop the colour depth: 1024×768 is only doable in 256-colour modes, for example. I might want to get some DIP chips to upgrade the VRAM, as it has two empty sockets. (It might, ironically, be cheaper to get another Mach64 with the chips already installed.) I was also able to add a Creative AudioPCI card I had lying around without too much hassle, though there are better options for ideal MS-DOS and Windows 95 audio that I might explore later. My main limitation so far is the lack of a PS/2 connector for the mouse, and I don’t have a serial mouse; however, I found an old InPort mouse with a serial adapter on eBay to serve that purpose, as having a mouse would be nice.

One thing I was struck by, much as with the iMac G3 I wrote about previously, is that despite being quite old, it still performs rather well with things like Office 97. Basically, it proves my theory that if you fit your software choices to the hardware, old hardware is still quite capable. I could write documents in Word or create spreadsheets in Excel without much bother and without really missing anything available on a newer system, and the machine works well with most older MS-DOS games. Older titles are helped by the turbo switch, which oddly doesn’t actually do anything via the button itself; instead, Control-Alt-Minus and Control-Alt-Plus change the speed, and the turbo light changes accordingly (it toggles between 133MHz and 25MHz, the latter roughly equivalent to a fast 386).

I might even experiment with connecting it to my network, perhaps even try to get Win95 working with shared directories from Windows 10, which would be rather funny. (Though I suspect I might need to open up security holes like SMBv1 to get that working…)

Posted By: BC_Programming
Last Edit: 30 Oct 2017 @ 06:52 AM


 13 Sep 2017 @ 1:41 AM 

A lot of older games and software have received source ports over the years, and many of those older titles have had remakes or revisions that improve the underlying engine, adding features that were not available when the game was created. These capabilities tend to start with source ports which add additional features, and sometimes a wild source port appears which is dedicated to introducing modern graphics features and/or high-resolution textures. There tend to be two camps regarding the “best” way to replay these older games: those who argue that modern source ports “aren’t the way the game was intended to be played” and violate a sacrosanct code of ethics, and those who think that if you can make something look or play better, why not? I fall largely into the latter camp. My logic is that had the capability existed at the time, the games would likely have shipped with the enhancements we now find in source ports; their absence isn’t a direct indicator that those aspects go against design intentions. Furthermore, I reason that if we go down that route about “intent”, then we should be forcing many of these games to run at very low resolutions or to struggle with low framerates, to match the typical system capabilities of the era.

A few specific titles come to mind. There are plenty of source ports and configuration options that allow players to stay “pure” and entirely consistent with the original titles, as well as ports which appeal to those who want to replay the game with a bit more flair.

Doom

Doom may not be the earliest game to receive this sort of treatment, as Wolfenstein 3D also saw some similar things, but it was the first “big” one. Doom is one of the earliest examples of rather wide modification of a game, partly because of how it was structured. This led to an endless supply of WAD files that you can download and play, particularly those constructed for Doom II.

Doom running via the “Chocolate Doom” Source port.

The original DOOM as well as Doom II was an MS-DOS game which ran on, unsurprisingly, MS-DOS. This meant that playing it on later systems, such as Windows 95, 98, and particularly 2000 or XP, meant dealing with quirks from it being an MS-DOS game. Source ports changed this; by taking the original source and porting it to these newer platforms, the game could run on any number of platforms that were distinctly not MS-DOS. It wasn’t long before these source ports started to include additional features. For example, Doom Legacy was a source port which, among other things, could have the blood splats that flew off of enemies stick to walls and floors. Other source ports had similar features, extended the engine, or added their own capabilities. Nowadays, there are large modifications to the base game, such as Brutal Doom or Project Brutality, which significantly alter gameplay. You can use high-resolution textures, and there are even models that you can use to replace all the sprites in the games. Traditionalists haven’t been left out; Chocolate Doom is a source port that runs on pretty much any platform, even MS-DOS itself, and seeks to keep the gameplay as original as possible.

Doom running via The GZDoom Source Port

One thing I’ve noticed in particular is that Chocolate Doom seems to have its own OPL3-based music synthesizer built in, whereas other source ports use the system MIDI synth. I’m personally partial to Project Brutality via GZDoom, as I feel it makes playthroughs more interesting and unpredictable thanks to new, unique enemies and weapons as well as enemy and pickup randomization. I also like to use ObHack, a tweaked version of the Oblige WAD generator, which I then modified (the Lua scripts) even further to increase enemy, ammo, and health amounts to ridiculous degrees, though that’s neither here nor there.

Duke Nukem 3D

Duke Nukem 3D via eDuke32, using the “Classic” Renderer

Duke Nukem 3D was the 3D followup to, well, Duke Nukem 2. The misogynist main character and his one-liners were right at home in the late ’90s, and it received an expansion (in the form of the Atomic Edition, as well as the Plutonium Pak, which basically changed a normal install into the Atomic Edition) and several unofficial add-ons such as “Duke it out in D.C.”. Like Doom, it also had a vibrant community of level creators, kickstarted somewhat by the original CD including the level editor, BUILD. It has its own share of source ports, but unlike Doom, there is one that stands head and shoulders above the rest: eDuke32. This port has pretty much any feature you could possibly want, including a brand-new 3-D renderer that allows for not only high-resolution textures and models but also features like dynamic lighting and support for new image maps, such as normal and bump maps.

Duke Nukem 3D, via eDuke32, using the Polymer Renderer and High resolution pack.

Quake

Quake ended up being quite a different game when released than when it was originally advertised in Commander Keen Episode 2:

The “Quake Preview”, then called “A Fight for Justice” found in Commander Keen, Episode 2.

Instead of what was presented there, by the time of its eventual release the game was almost, one could say, a Doom clone; the gameplay was very much the same as Doom, with a slightly different story surrounding it: rather than a space marine fighting through Hell’s demons, you are fighting interdimensional demons.

Quake as seen via the relatively faithful Makaqu source port.

Of course, Quake introduced a brand-new, fully 3-D engine, one of the earliest of its kind, and at the time its graphics capabilities were rather ground-breaking. While the DOS game normally used CPU-based software rendering, paired with support for VESA standard resolutions, you could also get a version which used OpenGL (or even a variant which internally converted OpenGL calls to Glide, for users with 3DFX cards). Quake, like Doom, was eventually open-sourced. As a result, it, too, has seen innumerable source ports: from ego-boosting <name>quake.exe versions, where people just slap their screen name or first name in front and add a few features, to full-fledged, complete reworks of the base engine. I’m partial to Darkplaces myself. This was actually part of what spawned this entire post: when I posted a screenshot of Darkplaces with “all the trimmings”, it was met with some lukewarm responses about how that “isn’t how it is intended”. Naturally, their recommendation for the “proper” way to play was another source port, Makaqu, which I’d most liken to Chocolate Doom.

the same location in-game as seen in the enhanced Darkplaces Source Port.

Quake II

Quake, a game about fighting interdimensional demon hordes, naturally spawned a sequel in which you play the role of a space marine in a retaliatory attack against an alien race. Alright, so it wasn’t really a sequel in anything but name. Even so, the gameplay is very similar.

A scene from Quake 2’s original released product.

And we see much the same “aftermarket” support appear long after the game had fallen from the public eye. One can find enhanced source ports with dynamic lighting and even high-resolution texture packs available for it, such as Quake2XP, r1q2, and Berserkerq2.

One of the best parts about games with these sorts of aftermarket additions and modifications is that you can generally play them on modern systems. A lot of older games that were held onto more closely by their manufacturers require very specific hardware configurations and setups, or have compatibility issues. This presents a problem for playing those titles today without procuring a system built specifically around a title’s particular limitations. Open-sourcing commercial games of yesteryear means that people who want to give a game a replay on their new PC can do so without major compatibility concerns, as well as crank the visuals to 11 if they so desire.

The same scene seen in Quake2XP’s enhanced engine.

Posted By: BC_Programming
Last Edit: 13 Sep 2017 @ 01:41 AM

Categories: Games

 07 Aug 2017 @ 6:24 PM 

A while ago, it came out that Microsoft Paint would be deprecated going forward on Windows 10, replaced instead with Paint 3D. There have been loads of articles, forum threads, and general griping about this across the Internet. Nonetheless, Paint is hardly the first “casualty” of Windows as it has moved forward; nor is its loss, realistically, a big one.

A History

“Paint” has existed in some form or another dating back to the original Windows release. Like many parts of Windows, it was based on an existing product, but stripped down; in this case, Windows Paintbrush was effectively PC Paintbrush 1.05 for Windows, cut down so as to not compete with the full product.

Windows 1.04

Paint on Windows 1.04

Aside from a smaller set of tools, another limitation of the included program is that it can only work with monochrome bitmaps. For the time period, that isn’t a surprising limitation- the Apple Macintosh’s MacPaint program was similarly monochrome.

Windows /286

PAINT running on Windows /286

Windows/286 didn’t change the included PAINT program very much- I wasn’t able to find any significant differences myself, at least; it seems to have the same limitations. I wasn’t able to get Windows/386 to work, but I presume PAINT is the same program between them, given that the major difference is the enhancements for the 386.

Windows 3.0

Paintbrush running on Windows 3.0

It was with Windows 3.0 that PBRUSH was effectively created. While still seemingly based largely on PC Paintbrush, the Windows 3.0 version- aside from changing the program title from “PAINT” to “Windows Paintbrush”, and the executable name to match- also redesigned part of the user interface. Interestingly, this interface is more similar to the more complete PC Paintbrush product provided with Windows/286, though of course it did not provide the full toolset of the commercial product either.

Windows 3.1

Paintbrush on Windows 3.1

PBRUSH didn’t see any significant changes from Windows 3.0. It still had a number of annoying limitations that plagued previous releases; in particular, tools couldn’t work with data outside the visible canvas. This meant you couldn’t even paste a screenshot into the program- it would be cropped. You can see this below: this is after performing a flood fill on the outer area of the above, then scrolling down- the exposed canvas was not affected by the operation.

Win 3.1 Paint floodfill failure

Windows 95

MSPaint on Windows 95

Windows 95 saw PBRUSH deprecated in favour of MSPAINT; not just deprecated, mind you, but altogether removed. However, you could still invoke PBRUSH, thanks to a new “App Paths” feature of Windows. This capability persists today: like Windows 95, Windows 10 has no PBRUSH.EXE, but running PBRUSH will start MSPaint, as it has since Windows 95.

The new Windows 95 version of Paint is now “Microsoft Paint” rather than “Windows Paintbrush”, and sports a new executable name as well. It also redesigns the interface to adhere to the new “3D” style that Windows 95 introduced, and makes use of other Windows features that had been enhanced. For example, while you could edit colors in the older Windows Paintbrush, that program used a set of three sliders for the customization; Windows 95 added a new Custom Color dialog, which Microsoft Paint used for customizing palette entries. Thanks to how that dialog worked, you could save several custom colors outside the normal palette and swap between them, too. It also adds a status bar, which was coming into its own as a convention with Windows 95; this included “tip” text appearing on the left, as well as other information appearing in additional panes on the status bar.

Windows 98

MSPaint on Windows 98SE

Windows 98’s release of Microsoft Paint seems to have removed the ability to load and save custom colour palettes. It also dropped the ability to save to the .PCX format, while gaining the ability to use certain installed image filters- allowing it to save to .PNG, for example, if certain other software is installed.

Windows ME

MSPaint on Windows ME

The Windows ME version of MSPaint appears to be identical to the Windows 98SE version; however, the executables are not identical- I’m not sure what the difference might be beyond the header indicating a different Windows version. It’s here for completeness.

Windows 2000

MSPaint on Windows 2000

Another entry for completeness as, like Windows ME, I cannot find any differences between it and the Windows 98SE release of MSPaint.

Windows XP

MSPaint on Windows XP

Windows XP introduced a few major revisions to MSPaint. First, it could acquire images from a scanner or any TWAIN device (such as a digital camera). Moreover, it now had native support for the JPEG, GIF, TIFF and PNG file formats, without any additional software installs.

Windows Vista

MSPaint running on Windows Vista

The Windows Vista release of Paint changes the default colour palette, has a set of new tool icons, and reorganizes some of the UI (the colour palette is moved, for example). It changes the undo stack to 10 levels deep rather than 3, and saves to JPEG by default- which suggests that it was intended or expected largely to be used for acquiring and saving photos.

Windows 7

MSPaint as included in Windows 7.

Windows 7 brought another major overhaul of the program, on the same level as the change from Paintbrush in Windows 3.1 to MSPaint in Windows 95. It redesigns the interface around the “Ribbon” concept, and adds a number of capabilities, brushes, and a few tools. It also now has anti-aliasing.

Windows 8

This version is pretty much identical to the Windows 7 release; though there are some minor adjustments to the Ribbon.

Future

Microsoft Paint is now deprecated, but this doesn’t prevent you from using it; even when it is removed from the default installation, it will still be made available as a free download from the Store. You can also copy a version of Paint from a previous Windows 10 install, if you’d rather avoid dealing with an appx container file or any tracking that comes with using the Windows Store. I think the fuss over this change is a bit of an overreaction: there are plenty of other free programs that can accomplish the same tasks, and while it is a bit annoying to have to download them, Windows will still include Paint 3D, which should be capable of the same standard tasks people want the older Paint program for, such as screenshots.

The old PBRUSH application running on Windows 10. It’s a Miracle.

What is this witchcraft? Windows NT 3.51 was 32-bit, but was based around Windows 3.1, so it got a 32-bit version of the same old PBRUSH program from Windows 3.1. That executable can be copied from an NT 3.51 install and run directly on Windows 10. Pretty interesting- though of arguably limited usefulness, beyond putting it at the end of blog posts to pad out the length for no reason.

Posted By: BC_Programming
Last Edit: 07 Aug 2017 @ 06:24 PM


 19 Jun 2017 @ 9:07 PM 

A quick ol’ post to recommend this excellent piece of software. For the longest time I just used the built-in WordPress editor to write, save, and edit posts. I tried some of the blog-writer programs back in 2010 or so, but ended up just going back to working directly in WordPress. I found that more and more I was writing blog posts in EditPad Pro, saving the images for them in the same folder, and then rebuilding them via copy, paste, and the media upload tools in WordPress. So I looked at the landscape of applications that allow a local program to be used to edit and “arrange” a post, then publish it- sort of like a FrontPage for blogs (the negative implications of that comparison notwithstanding, of course…). Open Live Writer is the program I found and have been using as a result of that search, and I’ve been quite happy with it so far. While it doesn’t quite seem to understand the blog theme, it still makes editing and arranging posts quite easy, particularly as I can practically edit a document like I would a Word document- insert pictures and all that- and it will upload them to the WordPress media library automatically. Immensely useful. You can also open drafts and even older posts for editing in the program. It’s quite flexible, and now I keep it open in my taskbar, so when a topic pops into my head I can write about it immediately- like this one, for example.

The only downside that I’ve noticed is that since it doesn’t recognize the theme, it also doesn’t seem to “get” the Crayon syntax highlighter I use, so when editing, the code tags are rather plain. Though they didn’t look any better in the WordPress editor, so it’s not like I’ve lost anything, either.

Posted By: BC_Programming
Last Edit: 19 Jun 2017 @ 09:07 PM

Categories: Software

 16 Jun 2017 @ 3:43 PM 

Windows 10 introduced a new software development platform- the Universal Windows Platform, or UWP. In some respects it builds upon the earlier Windows Runtime that was introduced with Windows 8. One interesting aspect of the platform is that- properly used- software can be built once and distributed to a number of platforms running Microsoft operating systems, such as the Xbox One.

I’ve fiddled a bit with UWP, but honestly I found it tricky to determine what it’s for; as it stands, its API, as well as its set of third-party portable libraries, simply isn’t anywhere near that of a typical application targeting the desktop via WPF or even Windows Forms. But I think that is intentional; these aren’t built for the same purpose. Instead, the main advantage of UWP appears to be being able to deploy to multiple Windows platforms. Unfortunately, that is an advantage I don’t think I can really utilize. However, I expect it will be well used for future applications- it has already been put to good use in games like Forza Horizon 3, which utilized it for “Play Anywhere”, so the game can be played not only on the Xbox console but on any capable Windows 10 system. Forza 7 will use it to much the same effect.

Even if I won’t utilize it, it probably makes a lot of sense to cover it. My recent coding-related posts always seem to involve Windows Forms. Perhaps I should work instead to learn UWP, and then cover that learning experience in new posts? If I am encountering these hurdles, then I don’t think it is entirely unreasonable to think perhaps others are as well.

I’ve also gotten to thinking that perhaps I have become stuck in my ways, as I’m not partial to the approach that brings web technologies to the desktop; even today I find web applications, and UIs designed around the web, to have a “feel” that lags behind a traditional desktop application in usability. That said, I’m also not about to quit my job just because it involves “legacy” frameworks; we’re talking about quite an old codebase- bringing it forward with library and platform upgrades would mean no time for adding the new features customers actually want. That, and the upgrade path is incredibly murky and unclear, with about 50 different approaches for every 50 different problems we might encounter, not to mention decisions like Framework versions and editions.

I know I was stuck in my ways previously, so it’s certainly worth considering- I stuck with VB6 for far too long, figured it was fine, and thought these newfangled .NET things were unnecessary and complicated. As it happens, I was wrong about that. So it’s possible I am wrong about UWP; and if so, then a lot of the negative discussion about UWP may be born of the same attitude and thinking. Is it that it is something rather large and imposing that I would need to learn that results in me perceiving it so poorly? I think that is very likely.

Which is not to suggest, of course, that UWP is perfect and it is I who am wrong for not recognizing it; but perhaps it is the potential of UWP as a platform that I have failed to assess. While there are many shortcomings, future revisions and additions to the platform are likely to resolve those problems, as long as enough developers hop on board. And it does make sense for there to be a reasonable trust model where you “know” what information an application actually uses or requests, rather than it being pretty much either limited user accounts or administrator accounts, where you don’t know exactly what is being used.

It may be time to come up with a project idea and implement it as a UWP application to start that learning experience. I did it for C# and Windows Forms, I did it for WPF, and I don’t see how the same approach couldn’t work for UWP- unless it’s impossible to learn new stuff after turning 30, which I’m pretty sure is not the case! If there is a way to execute other programs from UWP, perhaps the Repeater program I’m working on could be adapted. That is a fairly straightforward program.

Posted By: BC_Programming
Last Edit: 19 Jun 2017 @ 08:59 PM

Categories: Programming

 09 Jun 2017 @ 11:06 PM 

This is part of an occasionally updated series on various programming languages. It should not be interpreted as a benchmark, but rather as a casual look at various programming languages and how they might be used by somebody for a practical purpose.
Currently, there are articles written regarding Python, C#, Java and VB6 (merged for some reason), Scala, F# & Ruby, Perl, Delphi, PHP, C++, Haskell, D, VB.NET, and even QuickBASIC.

QuickBASIC is an out-of-place choice when compared to most of the other languages I’ve written about in this series. Why would I jump so far backwards to QuickBASIC?

There are actually a number of reasons. The first is that QuickBASIC imposes a number of limitations; aside from the more limited programming language compared to, say, C#, any solution needs to contend appropriately with issues such as memory usage and open file handles on MS-DOS. At the same time, a lot of the development task is actually simpler: one doesn’t need to fiddle with designers, or property pages, or configuration tabs, or anything of that sort. You open a text file and start writing the program.

The first task is to determine an algorithm. Of course, we know the algorithm- it’s been described previously- however, in this instance, we don’t have hashmaps available; furthermore, even if we wanted to implement one ourselves, we couldn’t keep all the information in memory. As a result, one compromise is to instead keep an array of index information in memory; that array can contain the sorted word as well as a record offset into another random-access file. So, to start, we have these two TYPE structures:
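The original listing didn’t survive in this copy of the post, so here is a hedged reconstruction based purely on the description above- the field names, and the packed-string workaround for the word list, are my assumptions, not the original code:

```basic
' Hypothetical reconstruction of the two TYPEs described in the text.
' 28 bytes for the sorted letters plus 4 bytes for a LONG offset gives
' the 32 bytes per index entry quoted later in the post.
TYPE SORTINDEX
    SortedText AS STRING * 28   ' the word's letters, sorted
    Offset AS LONG              ' record number in the scratch file
END TYPE

' One record per distinct sorted-letter key in the scratch file.
' QuickBASIC 4.5 does not allow arrays inside TYPE, so the matching
' words (the "SORTWORDS array") could be packed into one fixed string.
TYPE SORTRECORD
    WordCount AS INTEGER
    SortWords AS STRING * 448   ' room for 16 words of 28 characters
END TYPE
```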

By writing and reading directly from a scratch file whenever we need to add a new word to the “hash”, we can avoid having any of the SORTRECORD structures in memory except the one we are working with. This drastically reduces our memory usage, as did determining that the longest word in the dictionary is 28 characters, which fixes the size of the string field in SORTINDEX. The algorithm thus stays similar: given a word, we sort the word’s letters, and then we consult the array of SORTINDEX entries. If we find one with the sorted word, we take its OFFSET, read in the SORTRECORD at that offset, increment the word count, add the word to the record’s word list, and PUT it back into the scratch file. If it isn’t found in the index, we create a new entry- saving a new record with the word to the scratch file, and recording the offset and sorted text in the index for that record.
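Since the original listing is missing here, the loop body might be sketched roughly like this- variable, helper, and field names (SortLetters$, Index(), ScratchFile) are all my assumptions:

```basic
' Sketch of processing one input word (Word$).
' SortLetters$ is an assumed helper returning the word's letters in
' sorted order; Index() is the in-memory array of index entries.
Sorted$ = SortLetters$(Word$)
Found% = 0
FOR I = 1 TO IndexCount
    IF RTRIM$(Index(I).SortedText) = Sorted$ THEN
        ' Known key: pull the record, append the word, put it back.
        GET #ScratchFile, Index(I).Offset, Rec
        Rec.WordCount = Rec.WordCount + 1
        MID$(Rec.SortWords, (Rec.WordCount - 1) * 28 + 1, 28) = Word$
        PUT #ScratchFile, Index(I).Offset, Rec
        Found% = 1
        EXIT FOR
    END IF
NEXT I
IF Found% = 0 THEN
    ' New key: write a fresh record and remember where it went.
    Rec.WordCount = 1
    Rec.SortWords = Word$
    IndexCount = IndexCount + 1
    Index(IndexCount).SortedText = Sorted$
    Index(IndexCount).Offset = IndexCount
    PUT #ScratchFile, IndexCount, Rec
END IF
```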

Of course, this does have several inefficiencies that I won’t address. The first is that the search for the matching sorted word is a sequential search; ideally, the in-memory index would be kept sorted, and searches could use a binary search. I guess if somebody is interested, I’ve “left it as an exercise for the reader”.

Otherwise, all seems well. But not so fast- the dict.txt file has 45,402 words. Our type definition is 32 bytes, which means that for all words to be stored in the index, we would need 1,452,864 bytes- far beyond the conventional memory limit we are under. So we need to drastically reduce the memory usage of our algorithm. And we had something so promising! Seems like it’s back to the drawing board.

Or is it? Instead of trying to reduce how much memory our algorithm uses, we could reduce how much data it works with at a time. We can split the original dictionary file into chunks; and, as it happens, since words of different lengths cannot be anagrams of each other, we can simply split the file into separate files organized by word length. Then we perform the earlier algorithm on each of those files and append the resulting anagram list of each to one output file. That gives us one file listing all anagrams, without exceeding memory limitations!
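The split step itself is simple enough that a sketch fits in a few lines- a hedged reconstruction, since the original code isn’t preserved here, and the DICT.TXT/WORDSnn.TXT file names are my assumptions:

```basic
' Route each dictionary word into a per-length file (WORDS07.TXT etc.).
OPEN "DICT.TXT" FOR INPUT AS #1
DO WHILE NOT EOF(1)
    LINE INPUT #1, Word$
    OutName$ = "WORDS" + RIGHT$("0" + LTRIM$(STR$(LEN(Word$))), 2) + ".TXT"
    OPEN OutName$ FOR APPEND AS #2
    PRINT #2, Word$
    CLOSE #2    ' closed each time- see the open-file limit noted below
LOOP
CLOSE #1
```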

Before we get too excited, let’s make sure that the largest “chunk” is small enough. Using another QuickBASIC program (because, what the hell, right?), I counted the words of each length. The biggest chunk is words of 7 letters, of which there are 7,371 in the test dictionary file; that would require 235,872 bytes of index storage, which is well within the 640K conventional memory limit.

Of course, there is a minor caveat: we need to start QuickBASIC with a particular command-line argument, as, by default, the maximum size of a dynamic array is 64K. We do this by launching it with the /Ah parameter. Otherwise, we would find that the program hits “Subscript out of range” errors once the index grows beyond around 2,000 of our 32-byte records.

Another consideration I ran into was open files. I originally had the split program open all of the per-length output files at once, but it maxed out at 16 open files, so I had to refactor it to be much slower: reading a line, opening the appropriate file, writing the line, and then closing the file again. There may be a better technique here to increase performance; for reference, I wasn’t able to find a way to raise the limit, either (adjusting CONFIG.SYS didn’t help).

After that, it worked a treat- the primary algorithm runs on each length subset, and writes the results to an output file.

Without further ado- the full source of this “solution”:

And there you have it: an anagram search program written in QuickBASIC. Of course, it is rather basic, and a bit picky about preconditions (hard-coded for a specific file, for example), but it was largely written against my test VM.

Posted By: BC_Programming
Last Edit: 09 Jun 2017 @ 11:06 PM

Categories: Programming




