ClientDataSet - Very poor performance when used as an in-memory table??

Hello Everyone,

I have an app which reads in some information - and then writes it
out.  

I was writing it out to a ClientDataSet being used as a simple
in-memory table.  I have defined fields and four indexes (all simple
one-field indexes).

I also write it out to another database system - in this case I was
using Advantage Database.

The surprising thing was that it took 0.290 seconds to write it out to
the Advantage database - and over 3.5 seconds to write it out to the
ClientDataSet.  Both tables are defined almost identically - same
fields, same indexes...

So - any thoughts as to why writing out to a CDS is over 10 times
slower than writing out to a database (which actually updates the
files on disk as well)?  It doesn't make sense that it would be
slower, since it is writing to memory and not straight to disk.
Unless I am missing something?

Has anyone used another in-memory table component?  If so, was it
faster?


Bradley MacDonald
brad_AT_timeacct_DOT_com
Bradley
9/21/2009 9:09:31 PM

Bradley MacDonald wrote:

> So - any thoughts as to why writing out to a CDS is over 10 times
> slower than writing out to a database (which actually updates the
> files on disk as well)?  It doesn't make sense that it would be
> slower, since it is writing to memory and not straight to disk.
> Unless I am missing something?

Try setting LogChanges to False.
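
As a minimal sketch (the component name is just illustrative, and
LogChanges can only be set in code, not in the Object Inspector):

  cds.CreateDataSet;        // or after the data has been loaded
  cds.LogChanges := False;  // stop keeping the change log for pure in-memory use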

-- 
Erick Sasse
Erick
9/21/2009 9:13:58 PM
On Mon, 21 Sep 2009 14:13:58 -0700, Erick Sasse <esasse@gmail.com>
wrote:

>Try setting LogChanges to False.
Erick,
Good suggestion - but no difference at all.

Bradley
Bradley
9/21/2009 9:55:37 PM
> Has anyone used another in memory table component?

Yes. How many records are you inserting?

> If so - was it faster? 

Have you tried the same process WITHOUT the indexes?

Nowadays I can insert 50,000 records in a CDS within 3 or 4 seconds, using a modest machine.

Best regards
Alexandre
9/22/2009 12:02:58 AM
On Mon, 21 Sep 2009 17:02:58 -0700, Alexandre Machado <> wrote:

>Yes. How many records are you inserting?
>
>Have you tried the same process WITHOUT the indexes?
Another good suggestion - but it makes no difference.

I am inserting only 750 records - and it takes 3.5+ seconds.  I must
have something set wrong somewhere.


The code for the append itself is very simple:

    cds_DuplicateFiles.Append;

    cds_DuplicateFiles.FieldByName('FileName').AsString      := TAFileScanner.CurrFile.Name;
    cds_DuplicateFiles.FieldByName('FilePath').AsString      := TAFileScanner.CurrFile.Path;
    cds_DuplicateFiles.FieldByName('FileSize').AsInteger     := TAFileScanner.CurrFile.Size;
    cds_DuplicateFiles.FieldByName('FileAttr').AsInteger     := TAFileScanner.CurrFile.Attr;
    cds_DuplicateFiles.FieldByName('FileOwner').AsString     := TAFileScanner.CurrFile.Owner;
    cds_DuplicateFiles.FieldByName('FileCrtDate').AsDateTime := FileDateToDateTime(TAFileScanner.CurrFile.CrtTime);
    cds_DuplicateFiles.FieldByName('FileModDate').AsDateTime := FileDateToDateTime(TAFileScanner.CurrFile.ModTime);
    cds_DuplicateFiles.FieldByName('FileAccDate').AsDateTime := FileDateToDateTime(TAFileScanner.CurrFile.AccTime);

    cds_DuplicateFiles.FieldByName('FileDupKey').AsInteger   := 0;
    cds_DuplicateFiles.FieldByName('FileDupName').AsString   := '';
    cds_DuplicateFiles.FieldByName('FileDupCode').AsString   := '';

    cds_DuplicateFiles.FieldByName('FileKey').AsInteger      := UniqueKey;

    cds_DuplicateFiles.Post;
Bradley
9/22/2009 12:20:38 AM
Bradley MacDonald wrote:

> 
>     cds_DuplicateFiles.FieldByName

What Delphi version?
FieldByName in the Unicode versions is a lot slower than in previous
versions.

And even with older versions I suggest you use persistent fields or
Fields[Index]; if you don't know the indexes, look them up with
FieldByName once before the loop and save the TField references (a
sketch follows below).
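
A minimal sketch of that last idea - it assumes the same dataset and
field names as in your code, with the loop shortened to two fields and
a hypothetical scanner loop just for illustration:

  var
    FName: TField;
    FSize: TField;
  ...
  // Look the fields up once, before the loop.
  FName := cds_DuplicateFiles.FieldByName('FileName');
  FSize := cds_DuplicateFiles.FieldByName('FileSize');

  while TAFileScanner.Next do   // hypothetical loop over the scanned files
  begin
    cds_DuplicateFiles.Append;
    FName.AsString  := TAFileScanner.CurrFile.Name;
    FSize.AsInteger := TAFileScanner.CurrFile.Size;
    cds_DuplicateFiles.Post;
  end;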


Cesar Romero
Cesar
9/22/2009 12:25:34 AM
On Mon, 21 Sep 2009 17:25:34 -0700, Cesar Romero <cesar@liws.com.br>
wrote:

>What Delphi version?
>FieldByName in the Unicode versions is a lot slower than in previous
>versions.
>
>And even with older versions I suggest you use persistent fields or
>Fields[Index].
well - I have changed it to static fields - and it is exactly the
same...   3.5+ seconds to append 750 records.

Bradley
Bradley
9/22/2009 12:32:05 AM
Bradley MacDonald wrote:

> well - I have changed it to static fields - and it is exactly the
> same...   3.5+ seconds to append 750 records.

Very strange - all the tips here make a difference on my machine;
every one helps to improve performance, and you keep saying it is
still the same.
- Is the ClientDataSet linked to visual components?
  If so, use DisableControls/EnableControls (sketch below).
- Is the ClientDataSet triggering any event?
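
A minimal sketch of the DisableControls pattern (using the dataset
name from your code; the loop itself is omitted):

  cds_DuplicateFiles.DisableControls;   // stop linked grids from repainting on every Append/Post
  try
    // ... Append/Post loop here ...
  finally
    cds_DuplicateFiles.EnableControls;  // always re-enable, even if an exception occurs
  end;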

What is your Delphi version?
Did you try any profiler tool to check where the problem actually is?
Could it be in your TAFileScanner class?

I wrote a test here and ran it inside the Delphi 2009 IDE (Unicode),
Debug build. The results:

- 750 records
FieldByName: 47ms
Persistent Fields: 16ms

- 10,000 records
FieldByName: 297ms
Persistent Fields: 234ms

- 50,000 records
FieldByName: 1669ms
Persistent Fields: 1232ms


Then I built a Release build and ran it outside the IDE:
- 100,000 records
FieldByName: 3151ms
Persistent Fields: 2356ms

Only with 100,000 records is my time close to yours - you must have
some other problem there.

And here is the code:

  function CreateField(DataSet: TDataSet; FieldClass: TFieldClass;
    const FieldName: string = ''): TField;
  begin
    Result:= FieldClass.Create(DataSet);
    Result.FieldName:= FieldName;
    if Result.FieldName = '' then
      Result.FieldName:= 'Field' + IntToStr(DataSet.FieldCount +1);
    Result.FieldKind := fkData;
    Result.DataSet:= DataSet;
    Result.Name:= DataSet.Name + Result.FieldName;
    if Result is TStringField then
      Result.Size:= 255; // just remember to set TField.Size for string fields
  end;

procedure TForm31.Button1Click(Sender: TObject);
var
  I: Integer;
  Start: Cardinal;
  FileName    : TStringField ;
  FilePath    : TStringField ;
  FileSize    : TIntegerField;
  FileAttr    : TIntegerField;
  FileOwner   : TStringField ;
  FileCrtDate : TDateField   ;
  FileModDate : TDateField   ;
  FileAccDate : TDateField   ;
  FileDupKey  : TIntegerField;
  FileDupName : TStringField ;
  FileDupCode : TStringField ;
  FileKey     : TIntegerField;
begin
   cds_DuplicateFiles.Close;
   cds_DuplicateFiles.Fields.Clear;
   FileName    := CreateField(cds_DuplicateFiles, TStringField , 'FileName'   ) as TStringField;
   FilePath    := CreateField(cds_DuplicateFiles, TStringField , 'FilePath'   ) as TStringField;
   FileSize    := CreateField(cds_DuplicateFiles, TIntegerField, 'FileSize'   ) as TIntegerField;
   FileAttr    := CreateField(cds_DuplicateFiles, TIntegerField, 'FileAttr'   ) as TIntegerField;
   FileOwner   := CreateField(cds_DuplicateFiles, TStringField , 'FileOwner'  ) as TStringField;
   FileCrtDate := CreateField(cds_DuplicateFiles, TDateField   , 'FileCrtDate') as TDateField;
   FileModDate := CreateField(cds_DuplicateFiles, TDateField   , 'FileModDate') as TDateField;
   FileAccDate := CreateField(cds_DuplicateFiles, TDateField   , 'FileAccDate') as TDateField;
   FileDupKey  := CreateField(cds_DuplicateFiles, TIntegerField, 'FileDupKey' ) as TIntegerField;
   FileDupName := CreateField(cds_DuplicateFiles, TStringField , 'FileDupName') as TStringField;
   FileDupCode := CreateField(cds_DuplicateFiles, TStringField , 'FileDupCode') as TStringField;
   FileKey     := CreateField(cds_DuplicateFiles, TIntegerField, 'FileKey'    ) as TIntegerField;
   cds_DuplicateFiles.CreateDataSet;
   cds_DuplicateFiles.LogChanges:= False;

   Start:= GetTickCount;
   for I := 1 to 750 do
   begin
     cds_DuplicateFiles.Append;
     cds_DuplicateFiles.FieldByName('FileName').AsString      := 'TAFileScanner.CurrFile.Name';
     cds_DuplicateFiles.FieldByName('FilePath').AsString      := 'TAFileScanner.CurrFile.Path';
     cds_DuplicateFiles.FieldByName('FileSize').AsInteger     := I;
     cds_DuplicateFiles.FieldByName('FileAttr').AsInteger     := I;
     cds_DuplicateFiles.FieldByName('FileOwner').AsString     := 'TAFileScanner.CurrFile.Owner';
     cds_DuplicateFiles.FieldByName('FileCrtDate').AsDateTime := Now;
     cds_DuplicateFiles.FieldByName('FileModDate').AsDateTime := Now;
     cds_DuplicateFiles.FieldByName('FileAccDate').AsDateTime := Now;
     cds_DuplicateFiles.FieldByName('FileDupKey').AsInteger   := I;
     cds_DuplicateFiles.FieldByName('FileDupName').AsString   := '';
     cds_DuplicateFiles.FieldByName('FileDupCode').AsString   := '';
     cds_DuplicateFiles.FieldByName('FileKey').AsInteger      := I;
     cds_DuplicateFiles.Post;
   end;

   Memo1.Lines.Add(Format('FieldByName: %dms', [GetTickCount - Start]));
   Application.ProcessMessages;

   Start:= GetTickCount;
   for I := 1 to 750 do
   begin
     cds_DuplicateFiles.Append;
      FileName    .AsString       :=  'TAFileScanner.CurrFile.Name';
      FilePath    .AsString       :=  'TAFileScanner.CurrFile.Path';
      FileSize    .AsInteger      :=  I;
      FileAttr    .AsInteger      :=  I;
      FileOwner   .AsString       :=  'TAFileScanner.CurrFile.Owner';
      FileCrtDate .AsDateTime     :=  Now;
      FileModDate .AsDateTime     :=  Now;
      FileAccDate .AsDateTime     :=  Now;
      FileDupKey  .AsInteger      := I;
      FileDupName .AsString       := '';
      FileDupCode .AsString       := '';
      FileKey     .AsInteger      := I;
     cds_DuplicateFiles.Post;
   end;
   Memo1.Lines.Add(Format('Persistent Fields: %dms', [GetTickCount - Start]));
end;


Cesar Romero
Cesar
9/22/2009 1:02:31 AM
On Mon, 21 Sep 2009 18:02:31 -0700, Cesar Romero <cesar@liws.com.br>
wrote:

>Very strange - all the tips here make a difference on my machine;
>every one helps to improve performance.
>- Is the ClientDataSet linked to visual components?
>  If so, use DisableControls/EnableControls (sketch below).
>- Is the ClientDataSet triggering any event?
>
>[benchmark results and test code snipped]
>
>Cesar Romero
Cesar,

Well - DisableControls worked wonders!!  It is such a small record set
that I hadn't bothered disabling the controls while I was building the
form.  I think I was too close to the problem :)

Thank you very much.  Send me a quick email at
brad_AT_timeacct_DOT_com.  I appreciate the help.  I will send you a
free key to the shareware...  

Bradley MacDonald
Bradley
9/22/2009 1:45:51 AM
Bradley MacDonald wrote:

> So - any thoughts as to why writing out to a CDS is over 10 times
> slower than writing out to a database (which actually updates the
> files on disk as well)?
> 
> Has anyone used another in-memory table component?  If so, was it
> faster?
> 
> Bradley MacDonald
> brad_AT_timeacct_DOT_com

If ClientDataSet uses Midas.dll (I think it does, but am not sure),
then another option that may help is to link MidasLib directly into
your app.  I read or was told somewhere that this can increase
performance because you are then running Midas against the new FastMM
memory manager.  Just a thought - not sure if it would help.
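
A minimal sketch of what that looks like - the project and form names
below are hypothetical; the key point is simply adding MidasLib to the
.dpr's uses clause so Midas.dll is no longer loaded at runtime:

  program FileScannerApp;  // hypothetical project name

  uses
    MidasLib,   // statically links the MIDAS code instead of loading Midas.dll
    Forms,
    MainForm in 'MainForm.pas' {FormMain};

  begin
    Application.Initialize;
    Application.CreateForm(TFormMain, FormMain);
    Application.Run;
  end.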
Bob
9/22/2009 2:26:02 AM
Glad to see that your problem is solved.

If you want to put a massive amount of records in a CDS you should
consider using Andreas Hausladen's Midas Speed Fix: it gives Midas
linear performance as the number of records grows:
http://andy.jgknet.de/blog/?p=444
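
If I remember right, the fix is applied just by adding its unit to the
project's uses clause - the unit name below is assumed from that
download:

  uses
    MidasSpeedFix;  // assumed unit name from the Midas Speed Fix download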

Best regards
Alexandre
9/22/2009 9:48:05 AM