Reset sequence after data import
Hi all,
I've got a problem where we import data into a table with an auto-incremented field for a primary key. After the data import, I can drop the sequence, check the max(primary key), and re-create it to start at the max value + 1. My problem is that I need to do this for a large number of tables and don't know how to write a script to do it. In the Create Sequence command, whenever I try to use a variable in the "START WITH" clause, I get an error.
Thanks in advance,
Raymond
Spool the sequence-creation script to a file and then run it.
Or use dynamic SQL.
Or you can "drive" the sequence forward by issuing select myseq.nextval from dual; the appropriate number of times. If you need to drive the sequence backwards, alter it with an increment of -1, issue select myseq.nextval from dual; the appropriate number of times, and then alter it back to the previous increment.
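A minimal sketch of this "drive the sequence" trick (sequence and table names below are placeholders, not from the original post) avoids the repeated NEXTVAL loop by temporarily enlarging the increment:

```sql
-- Placeholder names: MYSEQ feeds the primary key of MYTABLE.
-- Suppose MAX(pk) in MYTABLE is 500 and MYSEQ last returned 100.

-- Jump the sequence forward in a single step instead of calling
-- NEXTVAL 400 times:
ALTER SEQUENCE myseq INCREMENT BY 400;
SELECT myseq.NEXTVAL FROM dual;        -- consumes the big increment
ALTER SEQUENCE myseq INCREMENT BY 1;   -- restore the normal increment
```

The same idea with a negative increment drives the sequence backwards.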
Similar Messages
-
Autonumber in a Sequence after Export/Import
Hi,
I have a sequence running in a DB, which increments an ID field. The problem comes up, when I need to backup/restore the DB. What I need then is a SQL script, which would allow me to set the MinValue to the max used value +1 (or + 100). Unfortunately something like
Create Sequence XXX.Mytable
Increment by 1
MinValue (SELECT MAX(COUNTERID) from XXX.Mytable);
MaxValue 1E+27
Cache 20;
does not work. So what is best practise to handle this case. I found many people posting this question, but no simple answer. Does anybody have one?
Thank you in advance
Pawel
Hi Pawel,
I would suggest creating a SQL Procedure and construct a string that would create the sequence and then use EXECUTE IMMEDIATE to create the sequence. Something like:
DECLARE
vMaxID NUMBER(10,0);
vSQL VARCHAR2(1000);
BEGIN
SELECT MAX(COUNTERID) INTO vMaxID FROM MyTable;
vSQL := 'CREATE SEQUENCE COUNTERID_SEQ MINVALUE ' || TO_CHAR(vMaxID + 1) || ' MAXVALUE 99999999999999999999999 INCREMENT BY 1 START WITH ' || TO_CHAR(vMaxID + 1) || ' NOCACHE NOORDER NOCYCLE'; -- no trailing semicolon inside the string passed to EXECUTE IMMEDIATE
EXECUTE IMMEDIATE vSQL;
END;
Regards
Andy -
How to reset site after data transfer limit is exceeded
hi everyone
I wonder if you can help me. I exceeded my data transfer limit on the 9th of the month; after I published a new site, many people viewed it and the limit was exceeded quickly. Now it is the 16th and the site is still unavailable. Why is this? Can anybody help me sort this out? It's just a few photos and one small video. It's rather frustrating.
thanks very much
Mac OS X (10.4.7)
Sounds like a question for Apple/.Mac support???
-
Problem with sequences after import
We recently imported some table data from QA to DEV, and now the DEV database has sequence errors. How can I resolve this? We are getting constraint-violation errors because the table sequences are out of sync after the import of several schema tables.
We run a script like the following to get the sequences as close as possible after importing from PROD to DEV:
set serveroutput on;
DECLARE
CURSOR CUR IS
SELECT otherdb_seq.sequence_owner AS owner,
otherdb_seq.sequence_name AS seq_nm,
TO_CHAR(otherdb_seq.last_number) AS start_nr
FROM dba_sequences currdb_seq,
dba_sequences@prod otherdb_seq
WHERE currdb_seq.sequence_owner IN ('~schemaname~') AND
currdb_seq.sequence_owner = otherdb_seq.sequence_owner AND
currdb_seq.SEQUENCE_NAME = otherdb_seq.sequence_name
ORDER BY 1, 2;
sql_string VARCHAR2(500);
BEGIN
FOR REC IN CUR
LOOP
sql_string := 'DROP SEQUENCE '||REC.owner||'.'||REC.seq_nm;
DBMS_OUTPUT.PUT_LINE(sql_string||';');
EXECUTE IMMEDIATE sql_string;
sql_string := 'CREATE SEQUENCE '||REC.owner||'.'||REC.seq_nm;
sql_string := sql_string ||' START WITH '||REC.start_nr;
sql_string := sql_string ||' MAXVALUE 999999999999999999999999999';
sql_string := sql_string ||' MINVALUE '||REC.start_nr;
sql_string := sql_string ||' NOCYCLE NOCACHE ORDER';
DBMS_OUTPUT.PUT_LINE(sql_string||';');
EXECUTE IMMEDIATE sql_string;
sql_string := 'DROP PUBLIC SYNONYM '||REC.seq_nm;
DBMS_OUTPUT.PUT_LINE(sql_string||';');
EXECUTE IMMEDIATE sql_string;
sql_string := 'CREATE PUBLIC SYNONYM '||REC.seq_nm||' FOR '||REC.owner||'.'||REC.seq_nm;
DBMS_OUTPUT.PUT_LINE(sql_string||';');
EXECUTE IMMEDIATE sql_string;
sql_string := 'GRANT SELECT ON '||REC.owner||'.'||REC.seq_nm||' TO PUBLIC';
DBMS_OUTPUT.PUT_LINE(sql_string||';');
EXECUTE IMMEDIATE sql_string;
END LOOP;
END; -
Sequence PK generation is messed up after table import
Thought I'd post this here where the possibility of direct object manipulation is what it needed.
Recently created new 10.2.0.1.0 database on new machine as a copy of one running on another machine. Installed Apex 2.2 and used the data import feature within to transport tables from the one machine to the new one. All tables were imported exactly as they were in the original DB. Primary Keys and column types are set exactly the same as the orig along with exact same AUTOID_PK column definition. New sequence was created and associated with correct PK column as in orig. Data import was all successful.
Then imported the associated Apex Application from orig DB, exported as 2.2 into new 2.2 Apex.
Problem occurs now when attempting to add new records, getting a PK violation apparently due to duplicate keys being created at insertion time from the PK's associated sequence.
I've done this same procedure in the past and not had this problem.
Using SQL Developer to look at the associated PK index I see that the index sees 205 distinct keys, 205 rows, and a sample size of 205.
++++ Does the problem stem from the fact that the next sequence value to be generated is 206? ++++
I'm not a pro with the functionality of Oracle as you may be able to tell.
The duplicate AUTOID value may come from the fact that although there are only 205 records and 205 distinct different AUTOID values, the values themselves do not run from 1 to 205 since there have been many deletions and additions. The highest value in the AUTOID field is 376, and I would assume the sequence needs to start generating the next value from 377 forward so as not to cause this PK violation.
That is just a deduction rather than established fact. Can anyone help identify the problem if I'm wrong, and can I get some help correcting this so I can continue from where I left off?
This is a problem in 2 tables at the moment, shouldn't be any more.
Thanks to anyone who can help.
Jacob
If I understood you correctly, the sequences were precreated on the target DB before running the import. If so, that is where the problem was created. Oracle exports a sequence definition not with its original START WITH value but with START WITH last_number. This ensures the imported sequence will not generate already-generated values. You most likely precreated the sequences with START WITH 1 and used imp with IGNORE=Y. As a result, the sequences started to generate duplicate values.
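One hedged way to repair an already-broken sequence after such an import (the table, column, and sequence names below are hypothetical, since the poster's schema is not shown) is to recreate it just past the highest key already in the table:

```sql
-- Hypothetical names: MYTABLE with PK column AUTOID fed by MYTABLE_SEQ.
DECLARE
  v_start NUMBER;
BEGIN
  -- Start one past the largest existing key (377 in the example above).
  SELECT NVL(MAX(autoid), 0) + 1 INTO v_start FROM mytable;
  EXECUTE IMMEDIATE 'DROP SEQUENCE mytable_seq';
  EXECUTE IMMEDIATE 'CREATE SEQUENCE mytable_seq START WITH ' || v_start ||
                    ' INCREMENT BY 1 NOCACHE';
END;
/
```

Any grants or synonyms on the dropped sequence would need to be recreated afterwards.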
SY. -
I just sold my iPhone 4s after doing a reset and deleting data and settings. Is there any way the buyer can see any of my old photos, texts, or data? I have confidential info for work and family photos, and I'm freaked out that they could be found somehow.
No, but if the answer was yes it's a little late to start getting freaked out about it.
-
The Sequence/order in which Data Import should be done?
Hi experts,
Can anybody suggest the sequence/order in which we should do the data import using DTW while setting up a fresh company, so that we can avoid dependency conflicts?
Is there any document available?
Thanks and regards
Ajith G
Hi!
Check List For Data Migration
Sl.No. Module Data
1 Finance Chart of Accounts
2 Finance Validating Cash Account, Control Account, Multi-Currency properties for G/L Accounts, Account Type, Relevant to Budget, Exchange Rate Differences, Rate Conversion, etc.,.
3 Finance Profit Centres
4 Finance Distribution Rules
5 Finance Mapping Distribution Rules to G/L Accounts
6 Finance Projects
7 Finance Mapping G/L Accounts to Projects
8 Finance Currencies
9 Finance G/L Account Determination
10 Finance Item Groups Accounting
11 Finance Warehouse Accounting
12 Finance Item level Accounting
13 Finance Tax Codes
14 Finance Plant Wise Tax Information
15 Finance Freight Types (Additional Expenses)
16 Banking Banks
17 Banking HouseBanks
18 Banking Payment Methods for PaymentWizard
19 Inventory Locations
20 Inventory Manufacturers
21 Inventory Package Types
22 Inventory Inventory Cycles
23 Inventory Item Groups
24 Inventory Item Properties
25 Inventory Warehouses
26 Inventory Define Default Warehouse
27 Inventory Excise Chapter Information
28 Inventory Items - (Attributes ex.: PurchaseUOM, InventoryUOM, SalesUOM, Conversion Factors)
29 Inventory Items - Labor & Travel Type
30 Inventory Customs Groups
31 Inventory ShippingTypes
32 Inventory LengthMeasures
33 Inventory WarehouseLocations
34 Inventory WeightMeasures (Volume)
35 Inventory PriceLists
36 Inventory BP Catalogue Numbers
37 Business Partners Customer Groups
38 Business Partners Vendor Groups
39 Business Partners BP Properties
40 Business Partners States
41 Business Partners Payment Terms
42 Business Partners Sales Employees
43 Business Partners Territories
44 Business Partners BP Customers
45 Business Partners BP Vendors
46 Business Partners BP Priorities
47 Sales SalesStages
48 Sales Industries
49 Sales Information Source
50 Sales Commission Groups
51 Sales Relationships
52 Sales Level of Interest
53 Sales Won/Lost Reasons
54 Sales CompetitorInfo
Sales Partners
57 Production Products
58 Production WIP Products
59 Production Production - Bill of Materials
60 Production Sales - Bill of Materials
61 Production Assembly - Bill of Materials
62 Production Template - Bill of Materials
63 CRM ActivityTypes
64 CRM ActivitySubjects
65 CRM ActivityLocations
66 HR Teams
67 HR Employees
68 Service Queue
69 Service ContractTemplates
70 Service Service Call Status
71 Service Service Call Priority
72 Service Service Call Origin
73 Service Service Problem Type
74 Service Service Call Type
75 Service Technicians
76 Service SolutionStatus
77 Administration MultiLanguageTranslations
78 Administration UserLanguages
79 Administration User Defaults
80 Administration Users
81 Administration Departments
82 Administration Branches
83 Administration UserPermissionTree
84 Administration Assign License to users
85 Opening Balance Item Opening Balances
86 Opening Balance G/L Opening Balances
87 Opening Balance BP Opening Balances
88 Configuration/Development oUserFields
89 Configuration/Development oDynamicSystemStrings
90 Configuration/Development oChooseFromList
91 Configuration/Development oQueryCategories
92 Configuration/Development oUserQueries
93 Configuration/Development oFormattedSearches
94 Configuration/Development PLD
95 Configuration/Development QueryPrintLayouts
96 Configuration/Development StoredProcedure
97 Configuration/Development Document Numbering Series
98 Configuration/Development Alerts
99 Configuration/Development Approval Procedures
Regards,
Thanga Raj.K -
Writing GPS data to Exif after aperture import cause lost faces information
I want my GPS information to be hard-written to RAW files (with ExifTool).
If I hard-geotag my images after Aperture import and Faces recognition, I lose the Faces information and have to redo the job.
Bummer ....
Aperture should provide such a functionality. -
Adding BPAddresses after main BP data import
Hi,
I imported the main BP data and am now adding the BPAddresses, so the Record key is the "linking" aspect between the 2 spreadsheets. At the moment I left the LineNum field blank, but when I import using the DTW it gives this error for the very first Record key:
This entry already exists in the following table (ODBC -2035) Application-defined or object-defined error
Is this error caused by the customer code/entry being seen as a duplicate, or should the LineNum be set to 1?
The first import did not bring any address info in
Thanks
Thanks for the help, but I'm still getting this error.
For the very first record key for the CardCode (AAO001-S):
AAO001-S
This entry already exists in the following table (ODBC -2035) Application-defined or object-defined error
The initial BP data import had no address information at all.
In the DTW I select
Business object = oBusinessPartners
Choose data files
I link the CSV for BusinessPartners as the source file, together with the CSV for BPAddresses.
The BP addresses format in the spreadsheet:
RecordKey = 1
LineNum = 0
AddressType = bo_BillTo
AddressName = Postal address
RecordKey = 1
LineNum = 1
AddressType = bo_ShipTo
AddressName = Physical address
I'm selecting import new objects
Any ideas? I'm unsure what the problem can be.
FDM data import from EBS via FDMEE failed after rolling back the 11.1.2.3.500 patch. Getting the below error message in the ERPI Adapter log.
*** clsGetFinData.fExecuteDataRule @ 2/18/2015 5:36:17 AM ***
PeriodKey = 5/31/2013 12:00:00 AM
PriorPeriodKey = 4/30/2013 12:00:00 AM
Rule Name = 6001
Execution Mode = FULLREFRESH
System.Runtime.InteropServices.COMException (0x80040209): Error connecting to AIF URL.
at Oracle.Erpi.ErpiFdmCommon.ExecuteRule(String userName, String ssoToken, String ruleName, String executionMode, String priorPeriodKey, String periodKey, String& loadId)
at fdmERPIfinE1.clsGetFinData.fExecuteDataRule(String strERPIUserID, String strDataRuleName, String strExecutionMode, String strPeriodKey, String strPriorPeriodKey)
Any help Please?
Thanks
Hi
Getting this error in ErpiIntergrator0.log. ODI session IDs were not generated in ODI / FDMEE. If I import from FDMEE, it imports data from EBS.
<[ServletContext@809342788[app:AIF module:aif path:/aif spec-version:2.5 version:11.1.2.0]] Servlet failed with Exception
java.lang.RuntimeException
at com.hyperion.aif.servlet.FDMRuleServlet.doPost(FDMRuleServlet.java:76)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:324)
at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:460)
at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:163)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:221) -
I am using the Lookout OPCClient driver to connect to AB PLCs (EtherNet/IP protocol) and power measurement equipment (Modbus TCP protocol). The OPC server is the NI OPC Servers. The data that are read out from PLCs and PMs are energy meter readings, energy counters, power, voltage, current, frequency, power factor and el. energy quality measurements (THD). That energy meter readings are being stored in SQL database.
I am experiencing a serious problem regarding the accuracy of the meter readings. Several times per day, randomly, meter readings are losing the time sequence. For example, sequence is: 167, after few seconds 165, 166. In other words, present value followed by two previous old values. That generates a serious problem in our application that is expecting a naturally rising sequence of counter values.
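As an aside, the out-of-order counter values can be flagged directly in the database (table and column names here are hypothetical, since the poster's schema is not shown) by comparing each stored reading with the previous one for the same meter:

```sql
-- Hypothetical table: METER_READINGS(meter_id, read_time, counter_value).
-- List rows where a counter value is lower than the previous reading
-- for the same meter, i.e. where the time sequence was lost.
SELECT meter_id, read_time, counter_value, prev_value
FROM (
  SELECT meter_id, read_time, counter_value,
         LAG(counter_value) OVER (
           PARTITION BY meter_id ORDER BY read_time) AS prev_value
  FROM meter_readings
)
WHERE counter_value < prev_value;
```

This only detects the symptom; the root cause in the OPC refresh sequence is analyzed below.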
Analyzing further, I isolated the problem to the connection between the Lookout OPCClient and the OPC server. I made a simple application in Lookout 6.7 (opcproc.lkp, attached) with OPCClient parameters NIOPCServers, OPC2, Asynchronous I/O, Update rate: 10000, Deadband: 0.0, that reads just one tag from the NI OPC Servers demo application (simdemo.opf).
Using the OPC diagnostic tool from NI OPC Servers, I recorded the sequence of OPC requests and responses. I found that OPCClient sends an "IOPCAsyncIO2::Refresh2()" call every 2.5 sec, which is a request to refresh all items in one OPC group. A few milliseconds later the OPC server responds with the callback "IOPCDataCallback::OnDataChange()" (Device Refresh) that actually refreshes the data.
This periodic sequence is intrinsic to the OPCClient and cannot be disabled or changed (to my knowledge). It is periodically interrupted by an "IOPCDataCallback::OnDataChange()" caused by the Update rate parameter of the OPCClient (the client is subscribed to the server for periodic updates of changed items).
In the case of the demo application, for every 4 refresh callbacks caused by refresh requests (2.5 sec) there is one update-subscription callback determined by the Update rate (10 sec).
QUESTION 1:
What is the purpose of the update sequence and Update rate when we get fresh values every 2.5 sec anyway?
PROBLEM
The problem arises when we have a large number of items in an OPC group. In that case the OPC server starts to queue refresh requests, because they cannot be fulfilled within the 2.5 sec period given the large number of I/O points that must be scanned. At the same time, update-subscription callbacks are running at the period determined by the Update rate. I observed in my production system that regular update callbacks have higher priority than refresh callbacks from the queue. That causes the loss of the timed sequence of data: after an update callback with fresh data, one or two refresh callbacks from the queue sometimes follow with old (invalid) data. By adjusting the Update rate parameter (1 hour, 2 hours ...) I can postpone the collision of data refreshes, but I cannot eliminate it. Furthermore, the 2.5 sec automatic refreshes are a large burden for systems with many I/O points.
QUESTION 2:
Is there a way to disable automatic refresh request every 2.5 sec and just use update requests determined by Update rate?
QUESTION 3:
Is there a way (or parameter) to change the period of automatic refresh (2.5 sec)?
This problem occurs in Lookout 6.5, 6.6 and 6.7, so I would say it is intrinsic to the OPCClient. If I use synchronous I/O requests there is no automatic refresh, but that is not an option for large systems.
Thanks!
Alan Vrana
System engineer
SCADA Projekt d.o.o.
Picmanova 2
10000 ZAGREB
CROATIA
T +385 1 6622230
F +385 1 6683463
e-mail [email protected]
Alan Vrana
SCADA Projekt d.o.o.
ZAGREB, Croatia
Attachments:
opcproc.zip 4 KB
The physical connection from LV to the switch is (I believe) copper crossover to a fiber converter into a switch, then fiber from the switch to the end device (relay). The relay has all of the typical Modbus registers and has been verified by inducing signals into the system, measured/polled in LabVIEW and observed in the Variable Monitor. I am working with LV 8.2 and 8.5.
An OPC server would only add an additional translation of addressing within the configuration. The only real drawback would be the network overhead required to do this processing, and it would not be representative of the end design configuration.
I will reiterate my question in another way:
I must answer a question for management relating to data collection, test results and analysis: how often are you polling the client in relation to the outcomes measured? At this time I cannot point at any configuration in the setup and execution that directs the data framing rate; I only measure the traffic and work with the results. This needs to be clearly identified, based on the relay's Modbus/TCP design capability of supporting a fixed number of client requests per second.
For testing purposes, I would like to be able to stress the system to failure and prove its capabilities with measured data. The present problem is that I have no basis for establishing varying polling rates that affect the measured data transmission.
This raises another question. What handles the Variable Monitor data requests and how is this rate determined?
Thanks for your interest in my efforts.
Steve -
Data Import defaulting Null date to system date
I am trying to import Campaigns using Data Import, and when End Date is NULL it defaults to system date. Is there any way to avoid this default? Even when I don't "map" the End Date, it still defaults it.
Thanks!
This field will always populate itself; what you have to do is establish a good date to put in this section. What I have done is create a system default of today() + 14, which means the end date will be roughly 14 days after the record was created. It is then up to the person running the campaign to change it if required.
-
PE 9 lockup after HD import from camcorder
Hi folks, I've been using PE for many years. Currently have PE9 installed, all the updates, drivers, QuickTime, etc. up to date. No problems at all, I've produced many videos. But all of a sudden on the last few times I imported from the camcorder (Canon HV20), it imports fine but when it's done, PE9 locks up. It has the video on the timeline, but doesn't show it in the video display window. I have to Ctrl/Alt/Del, which then brings up the "PE9 stopped unexpectedly" Windows error.
When I start it back up, the video is there and saved and everything works fine. So I can work around this, but not sure why all of a sudden it's hanging after I import video, the same way I've been doing it for years? And like I said, no problems doing this with PE9 until recently. No other changes to the computer, so I'm not sure what's up, any suggestions? (I searched around in the forum but didn't find anything).
Thanks.
There are quite a few things to check, and many are included in this ARTICLE and in its links. For driver updates, do not trust Windows or any update utility. Go to the manufacturer's web site, plug in your audio or video card make/model plus your OS, then download and install the latest driver.
Next, if the situation does not improve, please post the full details of your system. See this ARTICLE. Pay special attention to the I/O sub-system. This ARTICLE will give you background and tips.
Good luck, and please report back,
Hunt -
Hard Drive dead? - Failed to Issue COM RESET successfully after 3 attempts.
Hello, I think my 13" 2.53 GHz Unibody (mid-2009) MBP's hard drive has had its last spin. I was using it normally in Boot Camp when Windows 7 froze, so I rebooted holding the power button... Windows wouldn't start any more... OK, so I tried to boot into OS X... stuck at the loading screen with the Apple logo.
Went into verbose mode and it stops at "Failed to issue COM RESET successfully after 3 attempts. Failing..." Google says this has to do with the hard drive.
Did an extensive hardware test (boot + D), all OK; booted from the install disk and tried to run Disk Utility, but it crashed when trying to locate the drives (it didn't even list the drive in the right pane). Single User mode won't load either; it gives the same COM RESET message. I am literally out of options here.
Is there something I can do to either fix this or verify 100% that my hard disk is truly dead?
I have a backup in Time Machine so the data is OK, but the timing couldn't be worse; I have no time to fix this now and needed a working laptop. Highly frustrating.
thanks for any tips
Welcome to the Apple Forums!
Your HDD is probably dead, since the error doesn't appear to indicate something wrong with the motherboard. Just buy a new 2.5" HDD. I don't know about prices in Brazil, but here in the US, a 1TB drive runs for around US$100. Also, check on eBay for people who upgraded their drive and are selling their Apple drive. Some will ship to Brazil, and you can score those for $50-100.
I hope you backed up your hard drive. If not, you'll have to do the whole reinstallation process. -
GL : after journal import currency conversion type change to User
Hi
I have following data in GL_Interface table
User_currency_conversion_type = 'Corporate'
Currency_conversion_date = '9/21/2008' (taken from GL_Daily_Rates)
Currency_conversion_rate = NULL
As per GL_Daily_Rates, the conversion rate is 1.5 from this currency to the functional currency.
After journal import, gl_je_header is populated with
User_currency_conversion_type = 'User'
Currency_conversion_date = '9/21/2008'
Currency_conversion_rate = 1
Can you please explain why User_currency_conversion_type and Currency_conversion_rate are changed to the above values?
If you have any good document of "Journal Import Rules and Logic" , please share
Thanks in Advance
Hello Mr. G. Lakshmipathi,
Thanks! I'll try this function module READ_EXCHANGE_RATE.
However, I think this issue is not really related to the exchange rate. The amount conversion from USD to JOD happens correctly based on the current pricing formula that we are using, i.e. amounts in JOD have 3 decimals after conversion from USD, but when they are stored in the pricing structure the last decimal is truncated, as the currency against the 3 condition types is USD (USD has 2 decimals). Hence the amounts have 2 decimals.
Thanks,
Amit
Maybe you are looking for
-
Is there a way to use iCloud Photo Library WITHOUT syncing?
The idea I had of iCloud Photo Library was that I could upload my pictures to it and access them from all my devices WITHOUT having to sync them. At the end of every month, I move all the photos from my phone to my computer. It helps me save space an
-
Inserting a Mac command symbol into a document
Ok, this should be so simple but it's doing my head in. Basically, I'm preparing an indesign document - specifically a clue card sheet for call centre workers who use a Mac platform. I cannot for the life of me figure out how to insert the various sy
-
Enabling usage rights in Adobe Reader corrupts PDF text
Hi, I've created a registration form using Open Office writer which has editable fields. I convert this to a PDF and then add a signature field in Acrobat 8. At this point everything works fine, all the editable fields etc convert across, but I want
-
Startup - can't call it a crash - it just disappears
Hi. After seeing two other threads I followed the instructions and found it has to be something to do with fonts. However, I go through the process of: 1) removing them all and GB starts fine 2) putting back all in system font folder and GB starts fine
-
ITUNES STORE ISN'T WORKING?
I've tried restarting and downloading the latest version of iTunes... the rest of iTunes is working great, it's just the store... whenever I click the tab it starts loading, gets to halfway and just stops and freezes.... HELP