When we need to insert data into a table faster (on RAC or non-RAC databases), faster disk I/O is a good start (for example solid state disks, or RAID 1+0).


Anyway, what should we do on the Oracle Database side?

- Insert with NOLOGGING (see the sketch after this list)
- No archivelog mode
- No indexes
- Use a big block size (for example 16K or 32K)
- Watch redo log sizing if logging is used
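
A minimal sketch of a NOLOGGING, direct-path load, assuming hypothetical tables t_target and t_source:

alter table t_target nologging;

-- direct-path insert generates minimal redo when the table is NOLOGGING
-- (and the database is not in FORCE LOGGING mode)
insert /*+ append */ into t_target
select * from t_source;

commit;

alter table t_target logging;
-- take a fresh backup afterwards: NOLOGGING changes cannot be recovered from archived redo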

What are your ideas to improve this case?


Replies to This Discussion

Hi,

I'm using the "insert /*+ APPEND */" hint and it is good compared to the others.

If I insert with a subquery plus the APPEND hint, that works well.

It's not good with INSERT INTO XXX VALUES (.....,...):

begin
  for x in 1 .. 100 loop
    insert into t01 values (x, 100);
  end loop;
end;
/

Elapsed: 00:00:00.16

begin
  for x in 1 .. 100 loop
    insert /*+ append */ into t01 values (x, 100);
  end loop;
end;
/

Elapsed: 00:00:01.15

Hi Surachart,

Well, there are many answers, and my experience has shown that it always depends on the use case.

What you can do to make inserts faster:

1) Bulking (if you want to make it fast, do bulking!), which reduces commits and commit time (see the sketch after this list)
2) Enough ITL slots in the block (INITRANS)
3) Depending on the row size: block size, PCTFREE
4) No indexing
5) No integrity checks (foreign keys, NOT NULL constraints)
6) Buffer cache (if it's too small you'll quickly end up doing I/O)

And the list goes on....
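
For item 1, a minimal sketch of bulk binding with FORALL, assuming a hypothetical table t01(id number, val number):

declare
  type t_num_tab is table of number index by pls_integer;
  l_ids t_num_tab;
begin
  -- build the rows in memory first
  for i in 1 .. 10000 loop
    l_ids(i) := i;
  end loop;

  -- one context switch to the SQL engine instead of 10000 single-row executions
  forall i in 1 .. l_ids.count
    insert into t01 values (l_ids(i), 100);

  commit;  -- one commit at the end, not one per row
end;
/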

The APPEND hint is also very useful on inserts because it triggers direct-path inserts, which simply append the data above the table's high-water mark. But be aware that free space below the high-water mark will then not be reused!
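
A minimal sketch of that behaviour, assuming hypothetical tables t_target and t_source:

insert /*+ append */ into t_target
select * from t_source;

commit;  -- the same session must commit before it can query t_target again (ORA-12838)

-- if the free space below the high-water mark should be reclaimed later (ASSM tablespace):
alter table t_target enable row movement;
alter table t_target shrink space;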

Also, don't forget partitioning!


Hi Surachart,

Gerald made an excellent exposition and I totally agree with him.
From my personal experience I would recommend partitioning. My experience is in the telecom world, where we could easily do 20-30 million inserts per hour; some of those tables were partitioned by (day, hour) to improve insert speed, we had indexes, and we could not use /*+ append */ because of backup restrictions.
If you need real insight into your specific case, maybe you could post your full scenario.

Before any guessing, I suggest you first take a 10046 level 8 SQL trace and analyse it with tkprof to see where the time is spent on your inserts.
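
A minimal sketch of how such a trace could be taken for the current session (the tracefile identifier is just an assumption):

alter session set tracefile_identifier = 'insert_test';
alter session set events '10046 trace name context forever, level 8';

-- ... run the inserts to be analysed ...

alter session set events '10046 trace name context off';

-- then on the database server, format the raw trace file, for example:
-- tkprof <tracefile>.trc insert_test.txt sys=no sort=exeela,fchela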

PS: if there are triggers, they might also be a problem area.

Hi, do you have a column in the table that stores the RADIUS server id (e.g. server1, server2, ...)?
If so, is it indexed? Is it partitioned?
Maybe you could consider partitioning the table by that server column. For instance,
if the table looks like this:
c1, c2, radius_server_id, ...

You could use range or list partitioning on radius_server, and then on each server, when you issue the insert statement, you could specify the corresponding partition and server id:

create table abc (c1 number, c2 number, radius_server varchar2(14))
partition by list (radius_server) (
  partition s1_server values ('s1'),
  partition s2_server values ('s2')
);

insert into abc partition (s2_server) values (1, 2, 's2');
If the table is indexed by the server_id column, then using partitioning on that table and converting the index to local would produce even better results on insert.
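
A minimal sketch of such a local index on the list-partitioned example above (the index name is just an assumption):

create index abc_radius_idx on abc (radius_server) local;

Each index partition then covers only one table partition, so an insert into one server's partition touches only that server's index partition.
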
Thank you for your suggestion, that's a good idea.

Actually this table keeps the account, NAS IP, etc., recorded when users log on.
I have 4 indexes. I plan to recreate it as a hash-partitioned table, but I don't think hash partitioning will improve this.

Perhaps I might recreate it with range (month) partitioning.
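
A minimal sketch of what that could look like, assuming a hypothetical table with columns (account, nas_ip, logon_time) and monthly interval partitioning:

create table radius_acct (
  account    varchar2(64),
  nas_ip     varchar2(39),
  logon_time date
)
partition by range (logon_time)
interval (numtoyminterval(1, 'MONTH')) (
  partition p_first values less than (date '2016-01-01')
);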

About list partitioning, I have no idea which column to choose... for now.

Do you have any idea?

If you're planning to rebuild the table and add partitions, I don't think hash would do the trick on this one...
I stand by adding range or list partitioning on a server_id column and forcing each server to specify its own partition while doing the insert:

insert into abc partition (s2_server) values (1, 2, 's2');

Thank you.

And what about the SQL statement, block size, INITRANS, PCTFREE and FREELISTS...

What do you think? Any suggestions about them?
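
As a minimal sketch, these block-level attributes can be set at table creation (the names and values here are only illustrative assumptions, not recommendations):

create table t_fast_insert (
  id  number,
  val varchar2(100)
)
initrans 4              -- extra ITL slots for concurrent inserts into the same block
pctfree 0               -- insert-only table: keep no room back for future updates
storage (freelists 4);  -- FREELISTS matters only in MSSM tablespaces; ignored with ASSM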


