OptExpressions

Syntax: OptExpressions Mode

Parameters:
    Mode = Combined optimizer mode (Default: 1)

Returns: NONE
OptExpressions is a compile-time switch, much like Explicit, that lets you select the optimization mode anywhere you like (within reason). The switch controls a group of redundancy optimizations that the PlayBASIC compiler can apply to expressions and move operations during code generation. These optimizers remove redundant operations from your programs without altering their behavior, which improves performance.

When first added back in 2006, the optimizer was an optional feature that needed to be activated by the user, in case of teething problems. Soon after, it became activated by default; since the optimizer's job is to make your programs faster, there wasn't really much point in turning it off.

PlayBASIC V1.64N2 (May 2012) introduced another layer of Type & Array Caching features to the compiler's optimizer. These can be summarized as a caching mechanism that remembers common accesses of array and type fields within expressions and across lines. This optimization mode is biased towards Type read/write accesses, but unlike the redundancy optimizations, Type Caching isn't activated by default. The reasoning for this is somewhat complicated to explain, but it comes down to a legacy feature of the PlayBASIC runtime that allows users to read types that are yet to be formally created. Programs that depend upon this feature can experience problems when Type Caching is enabled. So rather than remove a legacy feature, the only logical conclusion was to disable Type Caching by default. Read the sections below for its safe usage.

Optimization Modes

There are currently two supported optimization modes, which can be activated individually or together:

1 (Bit 0) = General redundant variable & move short-cutting optimizations
2 (Bit 1) = Full Type Caching; cache sequential type/array accesses across lines and throughout expressions.
The optimizer's default setting is a value of 1. If you want to enable Type Caching as well, use OptExpressions 3. We can disable them both with OptExpressions 0, which turns off most of the optional optimization. Internally there aren't actually just two optimization methods, but rather a small army of them, many of which aren't optional; so turning optimization off won't have any effect on those.
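As a quick sketch of how the mode values above are used in practice (the surrounding code is illustrative only):

```playbasic
OptExpressions 3   ; bit 0 + bit 1: redundancy optimizations AND type caching

   ; ... time-critical code here ...

OptExpressions 1   ; back to the default: redundancy optimizations only
OptExpressions 0   ; switch the optional optimizations off entirely
```

Since OptExpressions is a compile-time switch, each setting applies to the code that follows it in the source, until the next OptExpressions statement.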
Why is optimization needed?

If you started game programming back in the 1980's, then you probably jumped head first into a land of Assembly Language. Assembly language is a human-readable form of the computer CPU's instruction set, and is considered by many to be one of the hardest ways of programming. There are many reasons for this, like each CPU having a different set of instructions, but primarily it's difficult because it's so simple: things we take for granted in high-level programming languages like BASIC have to be painstakingly programmed in. So programming in Assembly is often a very slow and tedious process. Which no doubt begs the question: if it's so slow, why do people use it? Well, speed... Assembly will generally give you the best performance for your program, more than compensating for the tedious nature.

Today it's rare to run into assembly programmers, since many have moved on to slightly higher-level languages like C/C++, Java or BASIC; modern CPUs are so fast that a little extra bloat from a high-level language isn't going to ruin a program's performance. What high-level languages have in common is that they're more readable, making them more abstracted from Assembly Language. As the gap between the high-level code we write and the Assembly code widens, the more assembly language instructions it takes to do the equivalent operation in a high-level language. The price we pay for this ease is speed: the more abstracted, the slower our languages get. Knowing this, modern languages introduced optimization modes, where the objective is to take the user's high-level code and represent that action in the least amount of Assembly code possible. The idea is to make our high-level code closer to what hand-written assembly might be, and therefore improve the performance of our high-level language. In PlayBASIC, optimization is something we're generally not aware of from a programmer's point of view.
The compiler takes your code and attempts to remove as much redundant code as possible, while not blowing out the compile times. The optimization process covers many things: pre-evaluation of literal math operations, pre-evaluation of constant functions (built-in functions only) within expressions, redundant variable removal from expressions, some loop/comparison short-cutting, some built-in function short-cutting, down to array and type caching. Most of the optimizations take place during expression evaluation, so the optimizer generally doesn't take a scope-wide view, with the exception of Type Caching. All of this makes your PlayBASIC programs run faster than they would without it.

User Optimization Techniques

While the optimization features found in modern compilers have come a long way, they're generally no match for a well-crafted solution. So in order to get peak performance from your application, it's well worth reading up on the various optimization techniques that can be applied to game programming. You'll find various tricks and tips throughout the help files and of course on the forums (www.PlayBASIC.com). When in doubt, don't be bashful; ask somebody on the forums for ideas about how they'd solve a particular problem.

Type Caching

Before PlayBASIC V1.64N2, the PlayBASIC compiler wasn't able to globally remove consecutive read/write accesses of user-defined types. So virtually every time you read or wrote to a field, the compiler would output the required byte code to perform this action. The output code has to check that the array/variable exists and work out where in memory the field data we want to access is. The thing is, this work is generally redundant, since we're often accessing the same type over and over. What the compiler's Type Caching mode does is detect situations where our code is making multiple accesses of the same type, then replace any consecutive read/writes with a short-hand version.
So the first access is as normal, but after that it's read/writing from the cached data, speeding up that section of code. To get the best out of the current implementation of type caching, there are a few situations we need to be aware of. Firstly, caching is biased towards code that accesses the same type (array/variable) either within an expression or across multiple lines. That means if you mix reading/writing between different types or arrays, the optimizer will be unable to cache these safely, and will revert back to the normal read/write method. Secondly, user-defined function calls will reset a cached type. The reason for this is that the compiler doesn't know whether the function alters our current cache data or not, so it takes the safe road, even though it's unlikely to cause any issues most of the time. Knowing this, we can rearrange any time-critical code so that it exploits caching to make execution as fast as possible. Let's assume we've declared a type called 'Character' and then dimensioned two variables, Player and Alien, of type Character. For example,
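The original declaration snippet isn't reproduced in this copy of the page; a minimal sketch of such a set-up (the field names are illustrative, chosen to match the discussion that follows) might be:

```playbasic
Type Character
    X, Y              ; position fields (integers)
    SpeedX, SpeedY    ; movement deltas
EndType

Dim Player as Character
Dim Alien  as Character
```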
Ok, so that's the set up; now let's take a look at some everyday usage code. Here we're just setting up the fields in both our Player and Alien typed variables.
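The field set-up code itself is missing from this copy; a sketch of the kind of code being described, writing every field of Player first and then every field of Alien (values are illustrative):

```playbasic
Player.X      = 100
Player.Y      = 200
Player.SpeedX = 4
Player.SpeedY = 2

Alien.X      = 300
Alien.Y      = 60
Alien.SpeedX = -2
Alien.SpeedY = 1
```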
When the compiler sees the first write into Player (Player.X) with type cache mode enabled, it'll output a normal write, then activate caching for any following reads/writes of Player fields. So the writes to the fields Player.Y, Player.SpeedX and Player.SpeedY have been simplified. When the compiler reaches the Alien.X line, it detects the change in variable, which stops caching of Player and returns to the default long-hand method. So the first time it sees Alien, it drops the normal write code for that operation, then caches the following writes to Alien.Y, Alien.SpeedX and Alien.SpeedY. Caching works best on code that has serialized usage focused around one type. So the following expressions resolve down into an initial long-hand read of Player, but beyond that it'll use a cached short hand for successive reads and writes of Player, even across lines. The same goes for Alien.
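The expression snippet being described isn't included in this copy; a sketch of serialized accesses that keep all the Player reads/writes together, then all the Alien ones:

```playbasic
Player.X = Player.X + Player.SpeedX   ; first Player access: long-hand read/write
Player.Y = Player.Y + Player.SpeedY   ; cached short hand from here on

Alien.X = Alien.X + Alien.SpeedX      ; cache switches over to Alien
Alien.Y = Alien.Y + Alien.SpeedY      ; cached short hand again
```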
Now, the optimizer prefers your accesses to be serialized: the more of them you can sequence together, the easier the job for the compiler and ultimately the PlayBASIC runtime.
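The reordered comparison snippet discussed next is absent from this copy; an interleaved version that alternates between the two variables, defeating the cache, might look like:

```playbasic
Player.X = Player.X + Player.SpeedX   ; long-hand access of Player
Alien.X  = Alien.X  + Alien.SpeedX    ; type changed: cache flushed, long hand again
Player.Y = Player.Y + Player.SpeedY   ; back to Player: flushed again
Alien.Y  = Alien.Y  + Alien.SpeedY    ; and so on...
```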
Just for the sake of comparison, here we've reordered the previous snippet. This version executes the same, but we're missing out on caching opportunities, making the code run that little bit slower. The performance impact depends entirely upon how frequently this section of code is run; a frequently used passage of code is going to take up more execution time than a section that's only used once or twice.

Cache Flushing Behaviors

For caching to work reliably, the compiler takes a pretty defensive approach, choosing to flush the cache when it encounters user-defined function calls, some built-in functions, as well as comparisons, branches and even loops. Calling a user-defined function/psub creates a problem, as the compiler doesn't actually know whether the function will alter the last accessed type or not, creating a situation where it's sometimes safe and sometimes not; so we choose to play it safe. Built-in functions are a bit different, since we know whether they can potentially alter a typed array/list/variable or not. So any time you use a function that takes an array parameter, it'll most likely flush the previous cache as well; if it doesn't, caching continues. Some other situations that flush the cache occur when two or more lines of code are separated by some loop/comparison structure. For example, if we have two lines of code that write to our variable Player, in the X and Y fields, where the lines are separated by a label, or an EndIf say, then the compiler will flush the cache between the writes. The reason for this is that the second write is dependent upon the first, so if you could jump over the first write, the program would most likely crash. Eg,
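The 'Eg' snippet is missing from this copy; a sketch of two Player writes separated by a label, which forces the compiler to flush the cache between them (the label name is illustrative):

```playbasic
Player.X = 100

MyLabel:               ; a jump could land here, skipping the Player.X write above...

Player.Y = 200         ; ...so the compiler flushes the cache and uses long hand here
```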
The compiler takes the same precautions when code overlaps the end of an If/EndIf statement or loop closing statement etc.

Type Caching Requires Manual Type Creation

During the implementation of the Type Caching optimization mode, we ran into something of a limitation, where the optimizer could break some sequences of code due to a by-product of the runtime allowing the user to read from non-existent type structures. While not a feature I'm a fan of, we're basically stuck with it today. Not sure what that means? Well, imagine we have an array of user-defined types. When we DIM the array, we're creating the 'container' for the types to live within, but we're not actually creating the individual types. The individual type structures aren't allocated until you either allocate one via NEW, or WRITE to the structure, which makes PlayBASIC automatically allocate a type of the array's type. Ie,
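The 'Ie' snippet isn't present in this copy; a sketch of DIMing a typed array and writing to two of its cells, which auto-allocates those two type structures (reusing the illustrative Character type from earlier):

```playbasic
Dim Things(100) as Character

Things(1).X = 100      ; writing auto-allocates a Character type in cell 1
Things(2).X = 200      ; and another in cell 2
```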
So after running the two writes, we end up with two shiny new types in our array. All the others are yet to be allocated, and thus empty. Now the thing is, often when iterating through a typed array we might use a loop that looks something like this.
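The loop in question is missing from this copy; a sketch of the naive iteration style being described, assuming a Things() array dimensioned with 100 cells:

```playbasic
For lp = 0 to 100
    Print Things(lp).X     ; reads fields from cells that were never allocated
Next
```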
Here the PlayBASIC runtime is polling every cell in our Things() array. Now, since only two of them exist, wouldn't this give illegal accesses? Yeah, it should... but it was decided that the runtime should ignore those illegal reads, rather than pop a runtime error on them. Given we're talking about decisions from ten or more years ago, it's a bit difficult to revert that today without breaking many older programs. The recommended solution is simple: rather than access the field of the type directly, we should always query each cell of the typed array to see whether it exists or not. If it exists, the query will return a value other than zero; if it's empty, it'll return zero. Like this,
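The 'Like this' snippet is absent from this copy; a sketch of the recommended form, querying each cell before touching its fields:

```playbasic
For lp = 0 to 100
    If Things(lp)            ; non-zero when this cell holds an allocated type
        Print Things(lp).X
    EndIf
Next
```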
This method is perfectly safe and is the preferred structure for such loops today, but I don't think it was in the original editions. So while a lot of modern code tests for the existence of the type structure prior to using it, some older examples/routines don't. This is not a big deal in a world without caching, or when it's disabled, but caching will only work consistently when the runtime knows for sure that the thing that was last accessed exists. Such legacy behavior can make enabling caching a little shaky in older PlayBASIC programs, which is why it's disabled by default in PlayBASIC V1.64N2 releases. The only way to make Type Caching the default optimization mode in PlayBASIC would be to disallow the user from reading types that don't exist. We tried this, but as expected, an awful lot of legacy programs wouldn't work. While they could be altered, it was decided we'd just leave the existing behavior in classic versions of PlayBASIC, and implement such changes in the PlayBASIC FX runtimes. Much less of a culture shock that way...

FACTS:

* OptExpressions mode can be changed at any time throughout a program, allowing you to activate it for some sections and not others.
* The Type Caching features are turned off by default in PlayBASIC V1.64. See above for the available modes.
* Type Caching works with typed variables, typed arrays and linked lists.

Example:

This example times a group of simple operations applied to typed variables and typed 2D/3D arrays. Each test runs 25,000 times and includes three read/write operations per test. In bandwidth terms, each test is reading/writing (25000*3*4) = 300,000 bytes; that's a lot of memory access. All this fetching is overkill, but it's useful for demonstrating how much of a difference caching can make to such fragments of code. On my test machine, the program executes about 50% faster with type caching than when it's disabled. So well worth using!
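The example program itself isn't included in this copy of the page; a minimal sketch of one such timing test, under the assumption of the illustrative Character type used earlier and PlayBASIC's Timer() returning milliseconds:

```playbasic
OptExpressions 3          ; redundancy optimizations + type caching

Type Character
    X, Y, SpeedX, SpeedY
EndType
Dim Player as Character

t = Timer()
For lp = 1 to 25000
    ; three read/write operations per iteration
    Player.X = Player.X + Player.SpeedX
    Player.Y = Player.Y + Player.SpeedY
    Player.SpeedX = Player.SpeedX + 1
Next
Print "Millisecs:" + Str$(Timer() - t)

Sync
WaitKey
```

Running the same loop with OptExpressions 1 and comparing the two times shows the difference caching makes on your machine.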
Related Info: ArrayBasics | Explicit | ForceDeclaration | Types | Variables
(c) Copyright 2002 - 2024 - Kevin Picone - PlayBASIC.com