Well, there are many answers, and my experience has shown that it always depends on the use case.
What you can do to make inserts faster:
1) Bulking (if you want to make it fast, do bulking!), which reduces the number of round trips, commits, and commit time
2) Enough ITL on the block (INITRANS)
3) Depending on the size of a row: Blocksize, PCTFREE
4) No Indexing
5) No integrity checks (Foreign keys, not null constraints)
6) Buffer cache (if it's too small you'll quickly end up doing I/O)
And the list goes on....
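To illustrate point 1, here is a minimal PL/SQL sketch of a bulk insert using FORALL (the table name and column are made up for illustration):

```sql
-- Hypothetical example: collect rows in a collection, then insert them
-- in one bulk operation instead of issuing one INSERT per row.
DECLARE
  TYPE t_ids IS TABLE OF NUMBER INDEX BY PLS_INTEGER;
  l_ids t_ids;
BEGIN
  FOR i IN 1 .. 10000 LOOP
    l_ids(i) := i;
  END LOOP;

  -- One SQL execution for all 10000 rows
  FORALL i IN 1 .. l_ids.COUNT
    INSERT INTO my_table (id) VALUES (l_ids(i));

  COMMIT;  -- one commit for the whole batch, not one per row
END;
/
```

The point is that the engine switches between PL/SQL and SQL once per batch rather than once per row, and you pay the commit cost once.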
The APPEND hint is also very useful on inserts because it uses DIRECT-PATH inserts, which simply append the data to the table. But be aware: since the data is written above the high-water mark, existing unused space in the table's blocks will not be filled up!
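A minimal sketch of a direct-path insert (table names here are hypothetical):

```sql
-- Direct-path insert: rows are appended above the high-water mark,
-- bypassing the buffer cache for the data blocks.
INSERT /*+ APPEND */ INTO my_table
SELECT * FROM staging_table;

-- A commit is required before the same session can query my_table again.
COMMIT;
```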
Gerald made an excellent exposition and I totally agree with him.
From my personal experience I would recommend partitioning. But my experience is in the telecom world, where we could easily do 20-30 million inserts per hour; some of these tables were partitioned by (day, hour) to improve insert speed, we had indexes, and we could not use /*+ APPEND */ because of backup restrictions.
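As a sketch of what such time-based partitioning might look like (the table and columns are invented for illustration, not the actual telecom schema):

```sql
-- Hypothetical range-partitioned table, one partition per day.
-- Concurrent inserts for different days land in different segments,
-- which reduces block contention.
CREATE TABLE cdr_records (
  call_time  DATE,
  caller     VARCHAR2(20),
  callee     VARCHAR2(20)
)
PARTITION BY RANGE (call_time) (
  PARTITION p_day1 VALUES LESS THAN (DATE '2024-01-02'),
  PARTITION p_day2 VALUES LESS THAN (DATE '2024-01-03')
);
```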
If you need real insight into your specific case, maybe you could post your full scenario.
Hi, do you have a column in the table to store the radius server id (e.g. server1, server2, ...)?
If so, is it indexed? Is it partitioned?
Maybe you could consider partitioning the table according to the server. For instance,
if the table looks like this:
c1, c2, radius_server_id, ...
you could use a range or list partition keyed on radius_server_id; then, on each server, when you issue the insert statement you could specify the corresponding server id.
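A minimal sketch of this layout, assuming the column names above and made-up server ids:

```sql
-- Hypothetical list-partitioned table keyed on the radius server id
CREATE TABLE radius_log (
  c1               NUMBER,
  c2               NUMBER,
  radius_server_id VARCHAR2(10)
)
PARTITION BY LIST (radius_server_id) (
  PARTITION s1_server VALUES ('s1'),
  PARTITION s2_server VALUES ('s2')
);

-- Each server inserts only into its own partition
INSERT INTO radius_log PARTITION (s1_server) VALUES (1, 2, 's1');
```

This way the servers never contend for the same segment's blocks.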
If you're planning to rebuild the table and add partitions, I don't think hash partitioning would do the trick on this one...
I stand by adding a range or list partition using a column server_id and forcing each server to specify its own partition while doing the insert.
insert into abc partition (s2_server) values (1, 2, 's2');