Channel: SCN: Message List

Re: Chasing documentation for Smart data access


Thanks for the help; I will try the internal forums for the expanded problem.

 

Basically, we have a number of tables representing user groups, user hierarchies, and user info, for security purposes.

 

We need to keep these in sync between two systems (HANA/Oracle).

 

History:

The original solution was to replicate the data; however, the source system truncates and rebuilds the tables so rapidly that it overloads the replication system, leading to periods where the two systems are out of sync.

 

So we have decided to trial Smart Data Access: the tables stay in Oracle, are exposed as virtual tables, and HANA queries them directly. Initial testing shows that without join relocation turned on the performance is horrible, and no wonder: it is pulling tonnes of records back from Oracle.
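For context, the setup looks roughly like the sketch below. The remote source name, DSN, credentials, and schema/table names are all placeholders, not our actual configuration; the `"NULL"` database component is the usual convention for Oracle remote sources.

```sql
-- Hypothetical remote source pointing at the Oracle system via ODBC.
CREATE REMOTE SOURCE "ORA_SEC" ADAPTER "odbc"
  CONFIGURATION 'DSN=ORASEC'
  WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=HANA_SDA;password=secret';

-- Expose one of the Oracle security tables as a virtual table in HANA.
CREATE VIRTUAL TABLE "SECURITY"."VT_USER_GROUPS"
  AT "ORA_SEC"."NULL"."SECAPP"."USER_GROUPS";
```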

 

Turning join_relocation on gives a big jump in performance, especially around memory usage. However, there is a new problem: it creates TEMP tables on the Oracle side, so it immediately needs much higher privileges, and, more importantly, it doesn't always get the data types right.
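For anyone following along, the switch was toggled along these lines. The ini section and parameter name here are from memory and should be verified against the documentation for your HANA revision:

```sql
-- Assumed location: smart_data_access section of indexserver.ini.
-- Verify the section and parameter name for your revision before using.
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('smart_data_access', 'join_relocation') = 'true'
  WITH RECONFIGURE;
```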

 

Basically, it quite intelligently identifies that it can bundle up some of the non-virtualized tables' results, push them to Oracle to filter against before bringing the results back, which improves performance. However, we are getting:

 

SAP DBTech JDBC: [403]: internal error: Failed to execute create and insert statement: [Oracle][ODBC][Ora]ORA-01727: numeric precision specifier is out of range (1 to 38)

 

and on inspection of the table it is creating (see below), it doesn't always get the data types right. Note the last column: 65535 is the maximum value for 16 bits, which looks like an uninitialized or sentinel precision rather than anything derived from the source column.

(The Oracle statement below was retrieved from the Smart Data Access trace in HANA Studio, not from Oracle auditing.)

 

CREATE GLOBAL TEMPORARY TABLE JRT_1_0X7FD550F57490 (
                "C1" number(38, 0)
                ,"C2" TIMESTAMP
                ,"C3" TIMESTAMP
                ,"C4" number(38, 0)
                ,"C5" nvarchar2(512)
                ,"C6" nvarchar2(512)
                ,"C7" number(38, 0)
                ,"C8" number(38, 0)
                ,"C9" nvarchar2(100)
                ,"C10" TIMESTAMP
                ,"C11" TIMESTAMP
                ,"C12" number(38, 0)
                ,"C13" nvarchar2(1)
                ,"C14" number(38, 0)
                ,"C15" number(38, 0)
                ,"C16" TIMESTAMP
                ,"C17" TIMESTAMP
                ,"C18" NCHAR(1024)
                ,"C19" nvarchar2(100)
                ,"C20" number(38, 0)
                ,"C21" nvarchar2(100)
                ,"C22" nvarchar2(1024)
                ,"C23" nvarchar2(128)
                ,"C24" nvarchar2(128)
                ,"C25" TIMESTAMP
                ,"C26" TIMESTAMP
                ,"C27" nvarchar2(1024)
                ,"C28" nvarchar2(1024)
                ,"C29" nvarchar2(1024)
                ,"C30" nvarchar2(1024)
                ,"C31" number(65535, 0)
)
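One workaround I am considering is explicitly casting the suspect column on the HANA side, so that whatever the optimizer pushes down already carries a precision Oracle will accept (its maximum is 38). A sketch, with hypothetical table and column names standing in for ours:

```sql
-- Hypothetical: force a valid Oracle precision on the column that
-- otherwise comes through as number(65535, 0).
SELECT U."USER_ID",
       TO_DECIMAL(U."GROUP_COUNT", 38, 0) AS "GROUP_COUNT"
  FROM "SECURITY"."VT_USER_INFO" U
 WHERE U."USER_ID" IN (SELECT "USER_ID" FROM "SECURITY"."LOCAL_FILTER");
```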



So basically I am trying to get my head around how those dials affect when it chooses to start pushing data remotely for filtering purposes. Ideally I want one of:

1. Always push, so I can try to isolate which column/data type it can't translate.

2. Only push filters directly on the virtual tables, to avoid the problem.
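To isolate which column trips the translation, I have been capturing the plan and reading the remote statement out of it. A sketch (the query here is a made-up stand-in for ours):

```sql
-- HANA: capture the plan for a query joining a virtual and a local table.
EXPLAIN PLAN SET STATEMENT_NAME = 'sda_probe' FOR
  SELECT V."USER_ID", L."ROLE"
    FROM "SECURITY"."VT_USER_GROUPS" V
    JOIN "SECURITY"."LOCAL_ROLES" L ON L."USER_ID" = V."USER_ID";

-- The remote (Oracle) statement text shows up in OPERATOR_DETAILS.
SELECT OPERATOR_NAME, OPERATOR_DETAILS
  FROM SYS.EXPLAIN_PLAN_TABLE
 WHERE STATEMENT_NAME = 'sda_probe';
```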

