Append query takes forever to run!

I have a table containing 450,000 records (yes I know it's probably pushing the limits for Access) and need to perform an operation to summarise certain records and append them to another table, containing about 2,000-3,000 records.

Not surprisingly, it takes quite a while (5 hours, in fact). I want to make sure that I have done everything I can to make this operation run as fast and as smoothly as possible.

Therefore I am in the process of making the following changes to all of the tables used in the append query:

1. set up additional indexes.
2. change several 4-character string fields from strings to long integer values (they store things like years and ID numbers etc).
3. set the 'Use Transaction' property in the append query to No.

I am hoping the above changes will make quite a dent in the time taken to perform this task. Can anyone suggest anything else that might help?

At present the table is sitting within the database, would it make any difference if the records were read from a flat file instead?

Also, is it likely to be any quicker if, instead of using an append query, I use VBA to write the new records?
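For reference, one pattern worth trying (not necessarily faster than a well-indexed append query; the query name below is a placeholder) is to execute the append through DAO instead of DoCmd, which skips the interface overhead, fails loudly on errors, and reports the affected row count:

```vba
' Sketch only - "qryAppendSummary" is a placeholder for your append query.
Dim db As DAO.Database
Set db = CurrentDb
db.Execute "qryAppendSummary", dbFailOnError
Debug.Print db.RecordsAffected & " records appended"
```

Row-by-row VBA inserts via a recordset are almost always slower than a single set-based append, so I would exhaust query-side tuning first.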

Windows 7, 64bit
Office 2010 Professional 32bit version 14.0.6112.5000

I have a button that I click that runs a delete query, then an append query, and then opens a report based on some parameters and the data in the table that the query appends to.
When I first open the database, the whole procedure takes a few seconds, which is acceptable. If I close the report and click the button again, it still takes a few seconds. But if I sort by any field in the report before closing it, the next time I click the button the append query takes over a minute to run. The code for the button is:
Private Sub btnOpenrptLotData_Click()
    DoCmd.OpenQuery "qryLotDataConglomerateDELETE"
    DoCmd.OpenQuery "qryLotDataConglomerateAPPEND"
    DoCmd.OpenReport "rptLotData", acViewReport, , "([qryLotDataConglomerateWithTotals].[FormattedDate] Between #" & VBA.Date & "# And #" & DateAdd("m", -1, VBA.Date) & "#) And ([FirstOfModel] Like ""*" & [VINCodeReportSortByUnitField] & """" & " Or [FirstOfModel] Like """ & [VINCodeReportSortByUnitField] & " *"")"
End Sub

But I don't think the problem resides there, because once the slow-down occurs, I can run the append query directly and it STILL takes over a minute to run. The SQL for the append query is:
INSERT INTO tblLotDataConglomerate ( RecordID, SeqNum, LotNum, VINCode, Model, FormattedDate, FormattedTime, UniqueID, SkippedVIN, RightVINCode7, FirstVINOfLot, LastVINOfLot, Quantity )
SELECT qryLotDataWithUID.RecordID, qryLotDataWithUID.SeqNum, qryLotDataWithUID.LotNum, qryLotDataWithUID.VINCode, qryLotDataWithUID.Model, qryLotDataWithUID.FormattedDate, qryLotDataWithUID.FormattedTime, qryLotDataWithUID.UniqueID, tblSkippedVINSheets.SkippedVIN, Right([VINCode],7) AS RightVINCode7, tblReceivedVINSheets.FirstVINOfLot, tblReceivedVINSheets.LastVINOfLot, tblReceivedVINSheets.Quantity
FROM (qryLotDataWithUID LEFT JOIN tblSkippedVINSheets ON qryLotDataWithUID.VINCode = tblSkippedVINSheets.SkippedVIN) LEFT JOIN tblReceivedVINSheets ON qryLotDataWithUID.UniqueID = tblReceivedVINSheets.UniqueLotID
ORDER BY qryLotDataWithUID.FormattedDate DESC, qryLotDataWithUID.FormattedTime DESC;

The query qryLotDataWithUID (which the append query is mostly based on) takes a little over a second to run both when the append query is running quickly and when it's running slowly.

Once I've Compact/Repair-ed, the problem goes away. Is there anything in the code that could be causing this problem, or does anyone have any ideas for changes or workarounds that might help fix it?

hi All,

Is there a size limitation on how many rows an Excel 2007 spreadsheet can have when importing to Access 2007? I have about 9,300 rows of data in Excel (not big), and the eventual size will be much bigger. The file size is not even 1,000 KB.

But even now, it's taking forever to import just 9300 rows of Excel data. Geez. I did it through the import wizard as well as through the DoCmd.TransferSpreadSheet function, which normally works like a charm.

In fact, it still hasn't completed. I killed the process the first few times, but then I let it run to see how long the import would take. When I came in the next day it had finally completed, BUT I do not see the table. Something is wrong... it shouldn't take this long, and the size isn't that big either.

Any ideas/suggestions?


So here's my setup. We have a database with several tables that have over 255 fields. So we must use pass-through queries to access some of the data.

Usually no problem.

Here is my pass-through query:

qry001: SELECT db001.table001.field001 from db001.table001

If I execute this, it instantly spits me to datasheet view with all the millions of records ready for viewing.

If I modify it to:

SELECT db001.table001.field001 from db001.table001 WHERE db001.table001.field001 = '123'

It works fine and displays the one record instantly.

Now on to the problem

When I make a new query, using this pass-through query as a data source such as:

SELECT qry001.field001 FROM tblLocal001 LEFT JOIN qry001 ON tblLocal001.field001 = qry001.field001;

This query takes forever to run.

And the thing is, tblLocal001 only has one record in it. In production, this table will have thousands so I need to understand why this query is taking so long and fix it if possible.
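For what it's worth, the usual explanation is that joining a local table to a pass-through forces Access to pull the entire remote result set down before filtering. One common workaround (a sketch only; names are taken from the post) is to push the filtering to the server by rebuilding the pass-through's SQL in VBA from the local table's keys:

```vba
' Sketch: rewrite the pass-through query's SQL with an IN list
' built from the keys in tblLocal001, so the server does the filtering.
Dim db As DAO.Database, rs As DAO.Recordset, keys As String
Set db = CurrentDb
Set rs = db.OpenRecordset("SELECT field001 FROM tblLocal001")
Do Until rs.EOF
    keys = keys & "'" & rs!field001 & "',"
    rs.MoveNext
Loop
rs.Close
If Len(keys) > 0 Then keys = Left(keys, Len(keys) - 1)
db.QueryDefs("qry001").SQL = _
    "SELECT db001.table001.field001 FROM db001.table001 " & _
    "WHERE db001.table001.field001 IN (" & keys & ")"
```

With thousands of local keys, an IN list gets unwieldy; at that point uploading the keys to a server-side table and joining there is the more scalable version of the same idea.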

Thanks for your help

I have another strange problem
My corporation has 13000 employees
I maintain data on about 500 that are qualified to do a certain task.
I finally got access to the personal data on ALL employees, but my queries take FOREVER to run on all 13,000 records. Can I create an update query to simply pull the data for my 500 into my own database, while not losing the 'other' data I have for those records?
The company DB has 140 fields; I only need the data from 9 or 10.
My DB has 30 fields.
Just pretend it's guys updating their phone numbers with HR but not letting me know, so I need to pull from HR's data and update mine. I want to do it with one simple query or form. Anyone know how I would go about this?
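The usual shape of this is a single UPDATE query joining the linked HR table to the local table on the employee ID; only the 500 rows present in both tables are touched, and local fields not listed in SET are left alone. The table and field names below are guesses:

```sql
UPDATE tblMyEmployees INNER JOIN tblHRMaster
    ON tblMyEmployees.EmpID = tblHRMaster.EmpID
SET tblMyEmployees.Phone = tblHRMaster.Phone,
    tblMyEmployees.Dept  = tblHRMaster.Dept;
```

Saved as a query, this can be run from a button on a form whenever a refresh from HR is needed.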

Hey guys.

If someone can find the time would they please have a look at the database I have attached here?

What it does is calculate average cases shipped for each day of the week from 4 tables (one per day), which contain an average of 15,000 records each. There are queries that also have to calculate what we call master cases for each day, because some things are shipped as a master case, or the master case is opened and individual units are shipped (this is identified as TRUE in the 'Break' field of the transactions tables).

In a nutshell, I want to calculate the averages for each day for all products in the t_ItemSearch table (which always has about 35,000 records) and put them into the t_CombinedVelocityTotals table for use in another database.

Everything seems to work properly but when I run the q_AllItemsWeeklyAverage append query, it takes forever to add the records to the table. By the way, I delete all records in the t_CombinedVelocityTotals table each time before I run the query.

I would really appreciate someone's help on this. Thanks.

I have an append query from a SQL Server db to a local table: three fields, one key field in each. 20,000 records takes about 30 minutes, with the database alternating between "not responding" and running every 5 seconds. My XP machine would have this done in less than 1 minute.

my XP machine has a broken video card at the moment,

Windows 7 x64, Office 2007, ODBC to SQL Server 2008, 8 GB RAM, 200 GB free disk space, high-end Dell laptop, with Excel and Outlook open as well (they don't affect performance, as an overnight download never finished either).

what am i missing?

Is there a security setting in McAfee which is causing some sort of virus checking? Is there a stopped service which, if started, would help?

thanks in advance. . .


I'm having difficulty deciding how to go about saving past information, but I believe an append query is the route to go. What I'm trying to do is this: once a project's information has been issued, the append query takes those line items from tbl1 and dumps them in tblRevA. So only the projects that have been rev'd up will be appended to the table. Then, when tbl1 has info going from RevA to RevB, it takes the info from tbl1 and appends the B-status info to tblRevB. I was curious whether there is a way to put criteria on a Rev field in the append query, so it looks at the Rev column and dumps data based on which Rev it is.
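One way to sketch this (table and field names assumed from the post) is one append query per Rev table, each with a criterion on the Rev column, so only matching rows are copied:

```sql
INSERT INTO tblRevA ( ProjectID, LineItem, Rev )
SELECT tbl1.ProjectID, tbl1.LineItem, tbl1.Rev
FROM tbl1
WHERE tbl1.Rev = 'A';
```

The tblRevB query would be identical with `WHERE tbl1.Rev = 'B'`. An alternative worth considering is a single history table with a Rev column, which avoids creating a new table per revision.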

Note: tbl1 will always have the most up to date information. the other Rev tables are for comparison so a report can be generated to show the changes made by comparing the different rev's

I am not sure if there is a forum for Access 2010. They updated my Office to 2010, and I no longer have 2003 on my system.
I am using Access 2010; I have been told it is the same as Access 2007. I know nothing about SQL, so I am stumbling here. I have created my database and tables with a PK and FKs.
My Main Table EMPLOYEE, has a PK EMP_ID, the other tables have it as FK
I can add records to the EMPLOYEE table through a Data Entry Form with no problem
There will be no record deletion at all in the database. If any record is edited, my boss wants it to write a new record to the table, so that if there is a dispute over any information, the original record can be retrieved and compared to the new, edited record.
I’m trying to edit EMPLOYEE data through a form, and once the information is changed in the form, have a command button run the append query to write the new record to the end of the table.
I have 6 test records in my EMPLOYEE table. On the form I have a combo box in the header to allow supervisors and up to press the down arrow and select the employee name; the form is then populated with the employee information in the detail area. If they change the Emergency Contact name, I need to be able to write a new record with the current information plus the new emergency contact information.
When I make that change in the form and run the append query, it wants to append all 6 test records to the table. All information is being pulled from this one table.
I do not know what I'm doing wrong; can you point me in the right direction?
Here is a copy of my append query. Qry name QryEditEmp
Thank you All. ( I have a Word Doc with a listing of all tables and creation info if that will help)
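For what it's worth, "it appends all 6 records" usually means the append query has no criterion tying it to the record currently shown on the form. A sketch of that fix (form and field names are assumptions, and the field list is abbreviated):

```sql
INSERT INTO EMPLOYEE ( EMP_ID, LastName, EmergencyContact, EditDate )
SELECT EMPLOYEE.EMP_ID, EMPLOYEE.LastName, EMPLOYEE.EmergencyContact, Date()
FROM EMPLOYEE
WHERE EMPLOYEE.EMP_ID = Forms!frmEditEmployee!EMP_ID;
```

One caveat: if the form is bound directly to EMPLOYEE, saving the edit overwrites the original values before the append runs, so to preserve history you would snapshot the record *before* the edit (or into a separate history table) rather than after.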

I have an append query that, every time it runs, simply duplicates the records. E.g., if the sample table has 11 records and you run the query again, TableB will then have 22 records. I need it to only update the records, so if someone makes a change or a new record in TableA, it will only update the records in TableB.
I have the Primary Key set to Yes no duplicates.
TableA Primary Key is ID
TableB Primary Key is CustomerID
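A common pattern for this (a two-step "upsert"; the non-key field names are assumptions) is to append only rows that don't yet exist in the destination, using a LEFT JOIN against it, and handle changed rows with a separate UPDATE query:

```sql
INSERT INTO TableB ( CustomerID, CustName )
SELECT TableA.ID, TableA.CustName
FROM TableA LEFT JOIN TableB ON TableA.ID = TableB.CustomerID
WHERE TableB.CustomerID Is Null;
```

With the primary key set to No Duplicates, the duplicate rows were presumably being rejected with key-violation warnings; this version avoids attempting them in the first place.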

Thank you,


SELECT FCST.REF_DT AS [Date Forecasted], ACTUAL.REF_DT AS [Date Shipped],
       FCST.QT AS [Qt Forecasted], ACTUAL.QT AS [Qt Shipped],
       FCST.PART_NR, FCST.REGION, FCST.TYPE,
       IIf(FCST.QT=0,'',FormatPercent((ACTUAL.QT-FCST.QT)/FCST.QT,2)) AS MPE

I am querying two tables: TBL_FCST_WKLY, which has 29,500 records, and TBL_ACTUAL_WKLY, which has 798,222 records.
When I run this query it sits there forever. Is it because I didn't optimize my query, or is it because the number of records is too much for Access? Can someone help, please?


A Brief background

I have a Projects database to create and track new product development, BE on the server, FE as an MDE application for users

I have the following setup within the FE
A form to enter project specifics (customer, product, etc.), with an append query on save that creates a separate task list related to the above project via Product ID, from a predefined list of tasks held in a table. This works fine on a single machine, but when scaled up and used on more machines simultaneously, I am getting duplicate task entries for one Product ID (unfortunately this happened during the first training session).

The number of machines open and the number of entries made at approximately the same time determine the number of repetitions I am seeing in the tasks per Product ID:

i.e. 3 x FE running, 3 x product records entered at approximately the same time = 3 x each task entered for each ID.

I could get around this by creating a delete query to remove the duplication, but I would rather understand how and why it happens, and fix the problem at the source.
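Two common defenses, sketched here with assumed names: a unique multi-field index on (ProductID, TaskID) in the tasks table, so a second append of the same pair is rejected, and/or a guard in the append query itself so it does nothing if tasks already exist for that product:

```sql
INSERT INTO tblProjectTasks ( ProductID, TaskID )
SELECT Forms!frmProject!ProductID, tblTaskList.TaskID
FROM tblTaskList
WHERE Forms!frmProject!ProductID NOT IN
      (SELECT ProductID FROM tblProjectTasks);
```

If the duplicates are appearing under the *same* Product ID from different machines, the save event is likely firing more than once per record, and the unique index is the safety net that makes that harmless.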

Any help would be greatly appreciated

So, I've been searching through this forum and can't seem to find the answer to this one. I would like to capture a value from a main form and have it used as a value in an append query, in order to populate a subform based on the main form, like so:

INSERT INTO tblTakenSurveys ( VisitID, SurveyQuestionID, ResponseCodeID )
SELECT Forms![frmMyFormName]!VisitID, tblSurveyQuestions.SurveyQuestionID, 66
FROM tblSurveyQuestions
WHERE SurveyID = 3;

Might help to explain some of the terms in this statement:

1. tblTakenSurveys is where I need the new data to be entered via the subform.
2. Forms![frmMyFormName]!VisitID is a textbox control bound to a PK in another table that has a one-to-many relationship with tblTakenSurveys.
3. tblTakenSurveys.ResponseCodeID is a foreign key that represents respondents' actual answers to questions.
4. 66 is a ResponseCodeID that stands for a dummy value meaning "data not yet entered".
As per advice I received from others on this forum, I have set a query like the above to run from a command button to populate the subform (in theory). But I'm sure I've done something wrong within the query because it will not return a value from the form "VisitID" control and therefore will not append the rows. Without the appended rows, my subform will not populate. And this has me running in circles...
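One thing worth trying: when an append query runs from code, Access sometimes cannot resolve a bare Forms!... reference on its own. Declaring it explicitly as a parameter often fixes that. A sketch using the names from the post:

```sql
PARAMETERS [Forms]![frmMyFormName]![VisitID] Long;
INSERT INTO tblTakenSurveys ( VisitID, SurveyQuestionID, ResponseCodeID )
SELECT [Forms]![frmMyFormName]![VisitID], tblSurveyQuestions.SurveyQuestionID, 66
FROM tblSurveyQuestions
WHERE SurveyID = 3;
```

The Long type is an assumption (a numeric autonumber PK); also make sure the form is open and the VisitID record has been saved before the button runs the query, or the reference will be Null.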

I pasted the link to another thread below, where I originally received a lot of input as far as the table structure. I did not start this one, but my posts are the most recent (as of now anyway). Pat Hartman had given me a lot of the guidance here.


I have a macro where I am running queries against several linked text files to append specific data from the linked text files to tables in my application. All the associated queries used to update my tables are in a macro. When I run the macro, all applicable data from the linked text files populate correctly in my application tables. In addition, if I run the queries individually outside of the macro, I achieve the same results.

When I attempt to run the macros from within my main form via the DoCmd.RunMacro command, I receive the following error:
Run-time error '3265': The text file specification 'ALV_GRC_PC_Structure' does not exist. You cannot import, export, or link using the specification.
I have also tried converting the macros to Visual Basic modules, and have tried relinking each text file just prior to running my first append query. Nothing seems to be working. Any help/thoughts would be greatly appreciated.

OK, I have a series of queries I need to run multiple times; they need to run for a set time period.
The queries run calculations for staffing numbers; the numbers are updated and evaluated each month for a set number of months.
The timeline is set during form open from a table. An example of the queries that need to run:

DoCmd.OpenQuery "qryPopRedSavN1u", acViewNormal, acEdit
DoCmd.OpenQuery "qryPopRedSavN2u", acViewNormal, acEdit
DoCmd.OpenQuery "qryPopRedSavN1au", acViewNormal, acEdit
DoCmd.OpenQuery "SetVacNoActivityNeg", acViewNormal, acEdit
DoCmd.OpenQuery "qrySetVacNoActivity", acViewNormal, acEdit
'DoCmd.RunSQL strSQL18
'insert setvacnoact query
DoCmd.OpenQuery "qrySetVacMonth", acViewNormal, acEdit
Any help on how to set this up with a loop that runs for the required number of months would be much appreciated.
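A minimal sketch of such a loop, using the query names from the post; the month count is assumed to come from a settings table (the DLookup names are placeholders):

```vba
' Sketch: run the sequence of queries once per month in the period.
Dim i As Integer, numMonths As Integer
numMonths = DLookup("NumMonths", "tblSettings")  ' placeholder source for the period
For i = 1 To numMonths
    DoCmd.OpenQuery "qryPopRedSavN1u", acViewNormal, acEdit
    DoCmd.OpenQuery "qryPopRedSavN2u", acViewNormal, acEdit
    DoCmd.OpenQuery "qryPopRedSavN1au", acViewNormal, acEdit
    DoCmd.OpenQuery "SetVacNoActivityNeg", acViewNormal, acEdit
    DoCmd.OpenQuery "qrySetVacNoActivity", acViewNormal, acEdit
    DoCmd.OpenQuery "qrySetVacMonth", acViewNormal, acEdit
Next i
```

If the queries need to know which month they are calculating, the loop counter can be written to a hidden control or a one-row table that the queries read as a criterion.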

Hi all,

I have a problem which i hope someone may be able to help me with.

Basically my users currently look at their sales statistics by looking at a query based on the table where they enter the records from.

As the sales figures change, the values in the table change, because the table that they actually see is run off a query which just averages the sales to date... confusing, I know.

Basically my users have asked me for an append query: they want to be able to click a button and append the data which they can currently see into a table. So, for example, if they clicked the button (ran the append query) last week and then today, they can see the difference, whether good or bad, over time.

My problem is that I am unsure how to record the date the users click the button on each of the records they append, so that in my report the snapshots will be separated by weeks and the user can see the clear difference.
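The usual trick is to add a date column to the snapshot table and have the append query stamp every row with Date() as it runs. Table, query, and field names here are placeholders:

```sql
INSERT INTO tblSalesSnapshot ( ProductID, AvgSales, SnapshotDate )
SELECT qrySalesToDate.ProductID, qrySalesToDate.AvgSales, Date() AS SnapshotDate
FROM qrySalesToDate;
```

The report can then group on SnapshotDate (or on a week expression over it) to show the change from one click to the next.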

I would appreciate any help, or pointers in the right direction.

If any of you could help me on this, I would really appreciate it.
I have a form (it includes several subforms, but that is not important). Using the Filter in the toolbar, I can narrow my data set.
I have a query based on the same tables as this form. I just need to link the form and query somehow.

I want to run the query (which has a field in common with the form) based on the filtered form recordset.
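One way to sketch this: read the form's Filter property in VBA and apply it to the saved query's SQL before opening it. The query names are placeholders, and this assumes the fields used in the filter also exist in the query:

```vba
' Sketch (in the form's module): apply the form's current filter to a query.
Dim db As DAO.Database, sql As String
Set db = CurrentDb
sql = "SELECT * FROM qryMyQuery"
If Me.FilterOn And Len(Me.Filter) > 0 Then
    sql = sql & " WHERE " & Me.Filter
End If
db.QueryDefs("qryMyQueryFiltered").SQL = sql
DoCmd.OpenQuery "qryMyQueryFiltered"
```

qryMyQueryFiltered would be a pre-created saved query that exists purely to receive the rewritten SQL.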


Long time listener, first time caller.

I have a database that is split into two files. Let's call them:
MyDatabase.accdb
MyDatabase_be.accdb

MyDatabase.accdb contains the functionality (forms, queries, reports, etc.), and MyDatabase_be.accdb contains all of the tables and data, which are linked into MyDatabase.accdb.

One table (CycleCounts) has a field "PickLoc," which is a product's picking location, and is scraped from another source. The first three digits of this PickLoc are the bin number, followed by the shelf number, then the lane number.

We have to inventory these products one bin at a time, and I need to be sure that all of the SKUs in a bin have been counted before going to the next bin. So, essentially, if we count anything from a bin, we must count *everything* in that bin.

These counts are in a separate table (ShelfCounts) with a many-to-one relationship.

In order to check for missing products, I had to do a query first to get which bins were done on which date (this query is called "BayByDate"):

	PARAMETERS vrDate DateTime;
SELECT DISTINCT CycleCounts.CC_Date, Left ([CycleCounts].[PickLoc],3) AS Bin
FROM CycleCounts INNER JOIN ShelfCounts ON CycleCounts.CC_ID=ShelfCounts.CC_ID
WHERE CycleCounts.CC_Date=[vrDate];

Then to check for missing products, I have this query (called "MissingProds"):

	PARAMETERS vrDate DateTime;
SELECT CycleCounts.SKU, CycleCounts.PickLoc, CycleCounts.CC_Date
FROM CycleCounts
WHERE (((Left([CycleCounts].[PickLoc],3)) In 
      (Select [BayByDate].[Bin] FROM [BayByDate];)) 
   AND ((CycleCounts.CC_ID) Not In 
      (SELECT CC_ID FROM [ShelfCounts];)))
GROUP BY CycleCounts.SKU, CycleCounts.PickLoc, CycleCounts.CC_Date
HAVING (((CycleCounts.CC_Date)=[vrDate]))
ORDER BY CycleCounts.CC_1stAFFLOW;

Essentially, if the bin number of the product is in the BayByDate report, but the CC_ID isn't in the ShelfCount table, select it.

The problem is that this query takes forever, sometimes a half hour or more, and this is only going to get worse as more and more records get added to the CycleCounts and ShelfCounts tables. Any ideas on how to speed this up?
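For what it's worth, Not In (subquery) is notoriously slow in Access because it can't use indexes effectively; rewriting that test as an "unmatched records" LEFT JOIN usually helps dramatically. A sketch of MissingProds in that style, using the same tables and fields as above:

```sql
PARAMETERS vrDate DateTime;
SELECT DISTINCT CycleCounts.SKU, CycleCounts.PickLoc, CycleCounts.CC_Date
FROM CycleCounts LEFT JOIN ShelfCounts
     ON CycleCounts.CC_ID = ShelfCounts.CC_ID
WHERE ShelfCounts.CC_ID Is Null
  AND CycleCounts.CC_Date = [vrDate]
  AND Left(CycleCounts.PickLoc, 3) In (SELECT [Bin] FROM [BayByDate]);
```

Indexes on CC_ID in both tables and on CC_Date matter at least as much as the rewrite; without them, every variant will keep degrading as the tables grow.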

Any help is appreciated.

Dear Access gurus,

I need to append data to existing records. I created an append query, which failed due to key violations. The problem is this: new data fields were added to existing records, and I need to append new data to the new fields of those existing records. However, this may be the cause of the key violation, because the existing primary key values are duplicated. Is my understanding correct?

Has anyone encountered a similar problem before? Did you find any solution? I hope the experienced experts here can give some ideas. Thank you very much.
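That understanding is right: an append (INSERT) tries to create new rows, so re-using existing primary key values raises a key violation. Filling new fields of existing records is an UPDATE. A sketch with assumed table and field names:

```sql
UPDATE tblExisting INNER JOIN tblNewData
    ON tblExisting.RecordID = tblNewData.RecordID
SET tblExisting.NewField1 = tblNewData.NewField1,
    tblExisting.NewField2 = tblNewData.NewField2;
```

Each existing row gets its new-field values from the matching row in the incoming data; no rows are added, so no key violations occur.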

I don't know what the problem is with this:
I have two tables, one is a billing table and one is a commission
table. I need to create queries/reports with this data.

Each has a field "order number" and "line number"; the order number may appear in each table multiple times, with different line numbers (order lines).

To match these up, I created a query from each table with a field called "Linematch" which combines the order number and line number fields: Linematch: [order number] & [line]

The individual queries run just fine, but when I try to match them up in the main query, I get "recordset not updatable" in the status line, and it takes forever to run... then some data shows up, but then it hangs.

I didn't create this database... I checked the tables and changed the "order number" field to indexed (duplicates OK), and changed the line number (which was Double) to Text, as the order number field is. I had done a calculation in the underlying order table; even taking that out and just joining the two tables with nothing else causes it to bomb. HELP!

I have checked the Access performance for tips but there were no problems found.

SQL for query making the join:
SELECT qryMBProofBase.[Sales Area], qryMBProofBase.[Employee Number], qryMBProofBase.[Business Partner],
       qryMBProofBase.[Order Number], qryMBProofBase.Line, qryMBProofBase.Item, qryMBProofBase.[Product Line],
       qryMBProofBase.[PL Sales], qryMBProofBase.[PL Cost],
       [PL Sales]-[PL Cost] AS [GP$], ([PL Sales]-[PL Cost])/[PL Sales] AS [GP%],
       qryMBCommbase.Com_Sales, qryMBCommbase.[Com Amount], qryMBCommbase.[Com %]
FROM qryMBCommbase RIGHT JOIN qryMBProofBase ON qryMBCommbase.Linematch = qryMBProofBase.Linematch;
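One thing worth trying: a join on a calculated concatenation can't use the tables' indexes, which is a classic cause of exactly this slowness. Joining the two base fields directly as a multi-field join usually runs far faster. A sketch (the commission-side field names are assumptions):

```sql
SELECT qryMBProofBase.*, qryMBCommbase.Com_Sales,
       qryMBCommbase.[Com Amount], qryMBCommbase.[Com %]
FROM qryMBCommbase RIGHT JOIN qryMBProofBase
    ON (qryMBCommbase.[order number] = qryMBProofBase.[Order Number])
   AND (qryMBCommbase.[line] = qryMBProofBase.[Line]);
```

Note that "recordset not updatable" is expected for many multi-table joins in Access and is harmless if the query only feeds reports.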

I've been using an append query to add new entries taken from our ODBC server to a table in Access.

It seems to work fine early in the year, but later on, when the record count from the ODBC source gets bigger, it takes longer to run: it can take 5+ minutes.

I did try changing it to a pass-through query, but it does not work if it's based on an append query.

Is there a better way?
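One common approach (sketched with placeholder table and field names) is to make the append incremental: pull only rows newer than the latest one already in the local table, so the runtime stays flat as the source grows instead of re-scanning the whole year:

```sql
INSERT INTO tblLocal ( EntryID, EntryDate, Amount )
SELECT dbo_Entries.EntryID, dbo_Entries.EntryDate, dbo_Entries.Amount
FROM dbo_Entries
WHERE dbo_Entries.EntryID >
      (SELECT Nz(Max(EntryID), 0) FROM tblLocal);
```

This assumes an ever-increasing key (an identity column or a timestamp works equally well). A pass-through that does the same filtering server-side, with a local append query selecting FROM the pass-through, gets the same effect when the linked-table version is still too slow.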

HAVING (((dbo_F_TASKS.TA_TASK_ID) Like "0293*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0294*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0295*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0296*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0297*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0298*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0299*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0300*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0301*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0302*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0303*" Or (dbo_F_TASKS.TA_TASK_ID) Like "0305*"))


I am currently working an a database that was originally developed by someone else. It is a bit of a complicated process but I'll explain the best I can.
Data is exported to an export folder from a third-party program once a day. In the database there is a macro constantly running, waiting for the data to be exported to the folder. Once the data is found, it is used in an append query, and the data in the export folder is killed.
The append query then puts the data into a table. The table is then used by various select queries to produce reports.
The problem: the data that is originally exported is fine. The data when it sits in the export folder is also fine. Once the data is put into the table by the append query, the data is wrong.
I'm assuming the problem is with the append query. I cannot see what data is in the append query, because until the export files appear there is no data to see, but as soon as the query sees the export files it runs and puts the data in the table. What I need to do is halt the append query while it is running.
Is this possible and if so how do I do it?
Sorry it is a bit long winded.
Thanks in advance.


I have a macro that runs a make-table query followed by a series of append queries each time the macro is called. I need the data from each append query to land below the existing data already created by the original make-table query and the append queries that have already run. The issue is that the data in fact appears to be appended in an unpredictable order (i.e., sometimes I run the macro and find that data from the append queries is below the data from the make-table query; at other times it appears above it). Unfortunately, this causes problems with downstream queries, which rely upon finding the last row of data subject to various filters. Can anyone help me understand how Access determines the sorting/positioning of new data appended to an existing table, and therefore help me resolve this issue?
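For background: Access, like any relational database, makes no guarantee about the physical order of rows in a table; an ORDER BY on an append is effectively ignored, and even Compact/Repair can reorder a table. The reliable fix is to make the order explicit: give the target table an AutoNumber column (here assumed to be called SeqID) or stamp each append batch with a sequence number, and have the downstream queries sort on it. For example, "the last row" then becomes:

```sql
SELECT TOP 1 *
FROM tblTarget
ORDER BY SeqID DESC;
```

Every query that currently depends on implicit position gets an explicit ORDER BY SeqID instead, which removes the unpredictability entirely.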

Thanks in advance,


Can an append query move all my data stored in multiple tables to another database with an identical table structure?

Because as I try to build the query, I keep getting prompted to select a table I want to append to, and I don't want to append to just one table...
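Append queries target one table at a time, so this takes one query per table; each can write directly into the other database via an IN clause on the destination. A sketch (the path and table name are placeholders):

```sql
INSERT INTO Table1 IN 'C:\Data\OtherDatabase.accdb'
SELECT Table1.*
FROM Table1;
```

A short VBA loop running one such statement per table (or DoCmd.TransferDatabase, if copying whole tables rather than appending into existing ones is acceptable) covers the multi-table case.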

Hi all, been a while since I asked for help but it's that time again!

I'm working on an append query. The query creates a record in tblRecords for each combination of Partner, Week and Genre. So if we had Partners A, B and C, who use Genres 1 and 2, and we have 52 weeks, that would result in 312 new records. This is all well and good, but each Partner has a defined start week and year and an end week and year, and I want the query to only append records on and between those dates. So if Partner A started on week 27, 2008 and ended on week 52, 2008, I would need this to be taken into account.

How could I achieve this? I know I can get parameters from a form (which is where this query is launched from), use global variables and such, but I don't know how to actually implement this. Maybe there is an easy way?

Each partner record has the week and year data stored within it.
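One way to sketch the criterion (table and field names are assumptions; this supposes the partner table stores StartWeek/StartYear/EndWeek/EndYear and a weeks table supplies each Week/Year): combine year and week into a single year*100+week number, which turns "on and between those dates" into a simple range test:

```sql
INSERT INTO tblRecords ( PartnerID, [Year], [Week], GenreID )
SELECT tblPartners.PartnerID, tblWeeks.[Year], tblWeeks.[Week], tblGenres.GenreID
FROM tblPartners, tblWeeks, tblGenres
WHERE tblWeeks.[Year] * 100 + tblWeeks.[Week]
      BETWEEN tblPartners.StartYear * 100 + tblPartners.StartWeek
          AND tblPartners.EndYear * 100 + tblPartners.EndWeek;
```

The unconstrained FROM list deliberately produces every Partner x Week x Genre combination, and the WHERE clause then keeps only the weeks inside each partner's own window, so no form parameters are needed.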

