Thursday, November 2, 2017

Using RushMode query optimization to speed up data access


First off, I did not make a typo when I said RushMode and not Rushmore. Most VFP users know the data access technique that Fox Software introduced in FoxPro 2.0 and called Rushmore technology. Frankly, I don't have any knowledge of how FoxPro's Rushmore technology works internally inside VFP, but Managed Dbf's RushMode is in the same boat as Rushmore technology: it speeds things up in data access and plays a part in query optimization.

Managed Dbf's RushMode can optimize LINQ by using a precondition, via the RushWhile, RushWhere, and RushWhileWhere overload parameters of LINQRush, when querying. These help optimize query performance and data access by utilizing the table's available index file.

Whether or not an index file is associated with the table, using LINQRush assures that every disk read meets 512-byte or 4K sector alignment: whether the host disk has a fixed 512 bytes per sector or follows the newer "4K sector" standard of 4096 bytes per sector, reads stay sector-aligned. Internally it uses file-block disk caching so that reading from and writing to the hard disk are more efficient, and record-row caching is done in the background before rows are yielded.
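To make the alignment idea concrete, here is a minimal C# sketch of the arithmetic only; the helper name, its signature, and the fixed sector sizes are assumptions for illustration, not Managed Dbf's internal code:

    // Illustration only: widen a requested file range to sector boundaries
    // before touching the disk. sectorSize would be 512 on older disks or
    // 4096 on "4K sector" disks.
    static (long AlignedOffset, long AlignedLength) AlignToSector(
        long offset, long length, long sectorSize = 4096)
    {
        long start = (offset / sectorSize) * sectorSize;  // round down to sector start
        long end = ((offset + length + sectorSize - 1) / sectorSize) * sectorSize;  // round up
        return (start, end - start);
    }

For example, a 100-byte read at offset 600 on a 512-byte-sector disk becomes one aligned 512-byte read at offset 512.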

Now, let's see the code in action, with the task of finding all sales invoices of a particular customer with an invoice amount of $1,000.00 or more. It will use the Customer and Sales Master tables.







SLOWEST 

This is the slowest implementation, a typical LINQ query: it queries with unspecified columns and does not use the RushWhere, RushWhile, or RushWhileWhere LINQRush overloads.
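Since the original code screenshot is not reproduced here, the following is only a rough sketch of the shape of such a query; the table objects and column names (customer, sales, CustId, InvoiceNo, InvoiceAmount) are hypothetical, and the parameterless LINQRush() call is an assumption about the API:

    // Slowest: no Rush precondition, so both tables are scanned in full,
    // and whole rows (unspecified columns) are materialized in memory.
    var slowest =
        from c in customer.LINQRush()
        join s in sales.LINQRush() on c.CustId equals s.CustId
        where c.CustId == "C-1001" && s.InvoiceAmount >= 1000m
        select new { Customer = c, Sale = s };  // entire rows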


SLOW


This is slow, but not the slowest, implementation: again a typical LINQ query, but with specified columns, and still not using the RushWhere and RushWhile techniques. Specifying columns saves some memory and may speed up the process.
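A sketch under the same hypothetical names as above; the only change is projecting just the needed columns:

    // Slow: still full scans (no Rush precondition), but projecting only
    // the needed columns keeps each yielded row small.
    var slow =
        from c in customer.LINQRush()
        join s in sales.LINQRush() on c.CustId equals s.CustId
        where c.CustId == "C-1001" && s.InvoiceAmount >= 1000m
        select new { s.InvoiceNo, s.InvoiceAmount };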


FAST

This is a fast implementation, but not the fastest: it utilizes RushWhere as a precondition before yielding record rows to LINQ.
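Another sketch, with the same caveat: the post names the RushWhere overload, but the exact signature below is an assumption:

    // Fast: the RushWhere precondition filters sales rows before they are
    // yielded, so LINQ never sees non-matching rows from disk.
    var fast =
        from s in sales.LINQRush(rushWhere: r => r.CustId == "C-1001")
        where s.InvoiceAmount >= 1000m
        select new { s.InvoiceNo, s.InvoiceAmount };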


FASTEST

This is the fastest implementation. First, it searches for the particular customer in the Customer table and, if it exists, captures the record row number; it then does the same thing on the Sales Master file.

On the Customer table, LINQRush uses RushWhile and sets the while condition, the index tag name, and the specific record position to start at. Even with millions of records, this reads only a single record from disk when the query takes place, since the record pointer is already positioned, and it immediately terminates the query on Customer when the next record does not meet the while condition.

On the Sales table, when joining, LINQRush uses the RushWhileWhere overload and sets the while and where conditions, the index tag name as the current index order to use, and the specific record position to start at. This reads records while the while-condition is true and yields a row only if the where-condition is also true.

This reads all of that particular customer's records in the Sales table, starting at the specified record position with respect to the specified index order, and yields a row only if the invoice amount is $1,000.00 or more.

Assuming the Sales table has millions of records and the particular customer has only 100 records in it, this will read only those 100 rows, not millions; and if, say, 50% of those records have an invoice amount of $1,000.00 or more, it will yield only 50 rows. In short, it reads only 100 rows out of millions and yields only 50.
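Putting the FASTEST flow together as a sketch: the Seek helper, the parameter names (rushWhile, rushWhere, indexTag, startAt), and the column names are all assumptions about how the described overloads might look, not the library's documented API:

    // 1) Locate the customer and the first matching sales row via the
    //    CUSTID index tag, capturing the record row numbers (assumed helper).
    long custRow = customer.Seek("C-1001", indexTag: "CUSTID");
    long salesRow = sales.Seek("C-1001", indexTag: "CUSTID");

    // 2) Customer: RushWhile starting at custRow reads a single record,
    //    since the query terminates once the while condition fails.
    // 3) Sales: the RushWhileWhere overload reads while CustId still
    //    matches and yields only rows of $1,000.00 or more.
    var fastest =
        from c in customer.LINQRush(rushWhile: r => r.CustId == "C-1001",
                                    indexTag: "CUSTID", startAt: custRow)
        join s in sales.LINQRush(rushWhile: r => r.CustId == "C-1001",
                                 rushWhere: r => r.InvoiceAmount >= 1000m,
                                 indexTag: "CUSTID", startAt: salesRow)
            on c.CustId equals s.CustId
        select new { s.InvoiceNo, s.InvoiceAmount };

With the record pointer pre-positioned by the index, the while condition acts as the stop signal; that is what keeps the read count at 100 rows instead of millions in the example above.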


Houston, we've had a problem here!


We can run out of memory using LINQ when gathering a large number of records. Assuming a particular customer has thousands of records that meet the query criteria, with such a high volume of records to collect we might hit an OutOfMemoryException. When working with large record sets, it might be a good idea to save them to a file rather than to memory. Managed Dbf can still use the xBase style of collating records without consuming too much memory, by using its table navigation technique to scan records in the table.
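As a sketch of that xBase-style navigation (the Go, Skip, Eof, and CurrentRow members below are modeled on the classic xBase commands GO, SKIP, and EOF(), and are assumptions, not confirmed API), matching rows are streamed straight to a file instead of being collected in memory:

    // Stream matching rows to disk one at a time; memory use stays flat
    // no matter how many rows match the criteria.
    using (var writer = new System.IO.StreamWriter("matches.csv"))
    {
        sales.Go(salesRow);  // salesRow captured via Seek in the previous sketch
        while (!sales.Eof() && sales.CurrentRow.CustId == "C-1001")
        {
            if (sales.CurrentRow.InvoiceAmount >= 1000m)
                writer.WriteLine(
                    $"{sales.CurrentRow.InvoiceNo},{sales.CurrentRow.InvoiceAmount}");
            sales.Skip(1);   // advance to the next row in index order
        }
    }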






2 comments:

  1. Hey bro! Enough with the RushMode, and rush this project ASAP LOL, this might be a great contender for developing an embedded database application for desktop.

    1. It has lots of possibilities and still needs some work bro ^_^y, not yet ready for production use ATM.
