Carbon to QTKit value conversion question
I am rewriting an application in Cocoa which was originally developed in Carbon. (It used Flash movies for custom interface elements, which QuickTime no longer allows, and Cocoa makes redoing the custom interface much easier, so this isn't as weird a task as it may sound.)
One of the things this program has to do is read a file containing a series of numbers, and have QuickTime jump a movie to points represented by the numbers. (The app is multiply-deployed with different movies, so the numbers have to be editable by a non-programmer.) The "read a file containing a series of numbers" part is pretty simple, but I'm lost when trying to figure out what to do with the numbers once I've got 'em.
The old Carbon code (circa 2001) says:
Movie theMovie = <pre-loaded movie>;
long jumpValue = <value from file>;
TimeRecord theMovieTimeRecord;
GetMovieTime( theMovie, &theMovieTimeRecord );
theMovieTimeRecord.value.lo = jumpValue;
SetMovieTime( theMovie, &theMovieTimeRecord );
So the question is: if I have a QTMovie containing the movie, and an NSNumber containing the number that was in jumpValue, what is the fastest way to make Cocoa do the equivalent jump? (Or, better yet, convert it to a QTTime?)
And as a followup, is there any easy way to get a callback when a QTMovie reaches a particular QTTime?
I have an answer to the main question, which falls in the "duh" category: use the "quickTimeMovie" method on the QTMovie to get the Movie primitive, and then just use the code exactly as-is (or as-was). It functions perfectly well, so yay.
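For reference, the pure-QTKit equivalent (without dropping to the Movie primitive) is also short. This is a minimal sketch, assuming `theMovie` is the pre-loaded QTMovie, `jumpNumber` is the NSNumber read from the file, and the file's values are in the movie's native time scale, as the Carbon code implicitly assumed:

```objc
// Reuse the movie's own time scale so the raw value means the same
// thing it did in the TimeRecord-based Carbon code.
QTTime now    = [theMovie currentTime];
QTTime target = QTMakeTime([jumpNumber longLongValue], now.timeScale);
[theMovie setCurrentTime:target];
```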
Given that I'm using a non-standard timing system, I suppose I'll have to use a kludgey workaround to watch for the movie reaching particular points. (For example, an NSTimer which calls a checking function several times per second to see how far the movie has gone. Not elegant, but there are worse things to do.)
Similar Messages
-
UPGRADE 1103 to 1159 - PAYMENTS CONVERSION QUESTION!
Hello,
Here is the scenario we are planning. We currently have 11.0.3 on one box, and we are planning a re-implementation of 11.5.9 on a new box. So we have to convert approximately 14 months of open and closed data for historical purposes (POs, invoices, payments, ledger data, etc.).
These conversions cannot be handled through the front end with a Data Loader-type tool. I have used the Data Loader tool a lot for loading Flexfield value set values, category codes, etc.
My question is about converting the AP_PAYMENTS-related data (check number, payment batch ID, invoice number, invoice ID, supplier number, supplier ID, etc.) in the back end, from the old 11.0.3 box to the new 11i box.
There has been some confusion around the Payments-related areas, because there are now APIs, etc. Some people say it is very difficult to migrate the data "as is" into 11i, and I have also heard that Oracle won't support this.
I just want clear clarification on these issues.
Because we are moving from the small 11.0.3 box to a big box for 11.5.9, we don't want to lose about 20 months of history, especially open/closed invoices, payments, POs, requisitions, active items on open blanket POs, employees, etc.
All of this sounds doable because there are APIs, etc. Of course, I believe the only object without an API is Vendors; we want to know whether this is true for Payments as well.
Any urgent clarification with a clear explanation is much appreciated.
Thanks in advance
Mani Varadan
The payment conversion is a little tricky in terms of understanding and functionality.
Payments conversion has many dependencies on other objects, such as invoices.
You need to be very clear on the requirements.
Here is an extracted document to give you a clear understanding of payments.
Overview of Process
Prior to creating a payment batch, four setup steps are required:
A. Define a bank and the bank branches with which you do business.
B. Define bank accounts.
C. Choose a predefined or custom payment format.
D. Finally, define payment documents for disbursements for each bank account.
Here is an overview of the steps required to create payment batch payments:
1) Initiate the payment batch by entering the criteria for invoices you want to pay. The payment batch process selects invoices, and then builds the payments. It determines which invoices will be paid on each payment document, and lists this information for you on the Preliminary Payment Register.
2) Make any necessary modifications to your payment batch, such as selecting additional invoices, deselecting invoices, deselecting suppliers, etc. Once modifications are complete, the modify payment batch process automatically builds the payments if any invoices were added.
3) Format payments to produce an output file.
4) Print checks from the output file if you are not creating electronic payments or sending the output file to a third party for printing.
5) Confirm the payment batch by recording the document numbers associated with each payment. During this step, the invoice status is updated to Paid and a document number is associated with the invoice and invoice payment.
As far as the conversion (from 11.0.3 to 11.5.9) is concerned, steps 3 and 4 are excluded from the standard process.
These are the tables involved during the payment conversion.
AP_SELECTED_INVOICES_ALL
AP_INV_SELECTION_CRITERIA_ALL
AP_INVOICES_ALL
AP_PAYMENT_SCHEDULES_ALL
The batch process involves the following background (concurrent) processes:
a. Autoselect
b. Build Payments
c. Preliminary Payment Register
d. Format
e. Confirm
f. Final Payment Register.
10.7:
If you are using the application in character mode, navigate to the Reset Payment Batch form. The navigation path is
\Navigate --> Controls --> Payment --> ResetPaymentBatch.
Select the payment batch name and the system displays the status of the batch. Your options for proceeding vary according to the batch status.
11.5.8/9
You can perform actions on your batch in Payment Batches form by clicking on the Actions button. The appropriate actions are highlighted depending on the status of the batch.
THIS SECTION GIVES A DETAILED TECHNICAL OVERVIEW
Payment Conversion Technical Overview
As far as the conversion is concerned, we follow a four-step process:
I. Define the prerequisite setups as discussed above
II. Select the invoices
III. Build the payments
IV. Confirm
THESE ARE THE PREREQUISITE SETUPS
We assume the bank setup has already been carried over from 10.7 to 11i. Bank setup means the bank account ID and its corresponding payment format.
For this purpose we use a mapping function for the bank account ID, and on that basis we get the corresponding check stock ID. This amounts to defining payment documents for disbursements.
SELECT INVOICES
When a payment batch is started, one row for each record is created in the
AP_INV_SELECTION_CRITERIA_ALL table.
This table stores the criteria that a payment batch uses to select invoices for payment. The module name for the AutoSelect process is APXPBSEL. Select Invoices, or AutoSelect, is the first step in the payment batch process.
When we run a concurrent program the AutoSelect process starts by loading records that meet the invoice selection criteria into the table AP_SELECTED_INVOICES_ALL from the tables AP_INVOICES_ALL and AP_PAYMENT_SCHEDULES_ALL.
The criteria that AutoSelect uses to determine which record to select from AP_INVOICES_ALL and AP_PAYMENT_SCHEDULES_ALL is stored in AP_INV_SELECTION_CRITERIA_ALL.
To be sure that no duplicate invoices get selected, invoices that are already selected in another payment batch are not selected. Invoice payments that have no remaining payment amounts are not selected either.
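In SQL terms, the selection rules just described can be sketched roughly as follows. This is an illustration only, using the table names from this note; it is not the actual seeded AutoSelect SQL, and the join and column choices (INVOICE_ID, PAYMENT_NUM, AMOUNT_REMAINING) are assumptions:

```sql
-- Illustrative only: pick schedules with money left to pay, skipping
-- anything already selected by another payment batch.
SELECT ps.invoice_id, ps.payment_num, ps.amount_remaining
  FROM ap_payment_schedules_all ps,
       ap_invoices_all          i
 WHERE i.invoice_id = ps.invoice_id
   AND ps.amount_remaining > 0
   AND NOT EXISTS (SELECT 1
                     FROM ap_selected_invoices_all si
                    WHERE si.invoice_id  = ps.invoice_id
                      AND si.payment_num = ps.payment_num);
```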
BUILD PAYMENTS
The module name for the Build program is APXPBBLD. From the list created by AutoSelect, the Build program determines which invoices will be paid on each payment document, and lists this information on the Preliminary Payment Register if selected. The Build program is spawned when AutoSelect has completed. When the Build program starts, the application uses information from the table AP_SELECTED_INVOICES_ALL to create rows in the table AP_SELECTED_INVOICE_CHECKS_ALL.
In addition to creating rows in AP_SELECTED_INVOICE_CHECKS_ALL, the Build Payments program also performs the following tasks:
Assigns document numbers for payments
Assigns check IDs
Renumbers the remaining documents starting from the first_available_document
Updates the payment batch status to BUILT
Since we are converting from 10.7 to 11i, where all payments need to be converted, we keep the payment batch status as FORMATTED.
CONFIRM
This is the final step of the payment batch process, where we pick a payment batch from the front end and confirm it. The navigation for this is:
AP Payables => Payment Batches form => Actions button
Internally, the Confirm program first updates the AP_SELECTED_INVOICE_CHECKS_ALL records, then transfers those records to AP_CHECKS_ALL.
In addition, the AP_SELECTED_INVOICES_ALL records are transferred into the corresponding AP_INVOICE_PAYMENTS_ALL, AP_INVOICES_ALL and AP_PAYMENT_SCHEDULES_ALL tables.
Tables Involved in the Payment Batch Process
These are the various tables involved in the payment process in 10.7 and 11i. It is recommended that, where a corresponding earlier conversion has been done, you use the mapping function to get the new value.
o AP_SELECTED_INVOICES_ALL
o AP_INV_SELECTION_CRITERIA_ALL
o AP_INVOICES_ALL
o AP_PAYMENT_SCHEDULES_ALL
o AP_SELECTED_INVOICE_CHECKS_ALL
o AP_CHECKS_ALL
o AP_INVOICE_PAYMENTS_ALL
o AP_BANK_ACCOUNTS_ALL
Common Issues:
1. How do I determine the status of the payment batch?
A: Query the payment batch in the Payment Batches Summary window to view the status. If you cannot get into the application, you can select the status from SQL*Plus:
SELECT status FROM ap_inv_selection_criteria_all
WHERE checkrun_name = '<payment batch name>';
2. Don't change the last update date.
Since we are converting payments, we can't change LAST_UPDATE_DATE to SYSDATE, as this would cause a problem for the concurrent program; in that case we need to populate the field FUTURE_DATED_PAYMENT = N.
3. Don't populate a value for INVOICE_PAYMENT_TYPE.
If there are prepayments in the conversion process, do not populate this field; pass NULL. Otherwise it will create a duplicate invoice, which causes a problem for the payment concurrent batch.
I hope the process is now clear.
Contact me offline if you need anything further.
regards
sanjit
[email protected] -
Hello,
I am trying to meet new business requirements for the client I am working at. They recently decided to eliminate all company codes and condense to one purchase organization. I have my purchase org and company code data in a tuple.
Can I use a value conversion filter on all data in my repository to eliminate company codes and condense the purchase orgs? I guess my question really is: can you use a value conversion filter if the field values to be converted are in tuples?
Thanks, experts!
Hey,
If you apply value conversion filters at the field level, the conversions are applied sequentially to each converted value that has not been individually edited or converted at the value level. When you first map a field, each value in the Converted Value column appears in gray to indicate that it is inheriting value conversion filters from the field level. When you apply a manual edit or a value conversion filter to an individual value, the Converted Value appears in black to indicate that you have overridden inheritance from the field level. You can use the Restore Converted Value command to restore inheritance for a value.
Value conversion filters automate repetitive and error-prone transformations, eliminating manual typing and the possibility of user error. They also allow powerful reformatting algorithms to be applied to an entire group of values.
No conversion filter is necessary to trim leading and trailing spaces from source values, as Import Manager does this automatically when importing values into an MDM repository.
Hope it gives you clarity.
Deep -
Creating Value Conversion Filters in Import Manager
Hello.
I am trying to create some Value Conversion Filters that change street suffixes to abbreviations. Such as Lane to Ln.
How would I go about doing this? Am I supposed to use the find/replace Conversion Filter?
Points awarded to any helpful answers / helpful guides (besides the Import Manager Guide) <--- I have this already
Thanks,
Nichole
Hi,
Yes, you can use the Find/Replace conversion filter property in Import Manager.
In it you can enter the word you want to change and then the required format.
Note that these conversions are permanent, whereas if you do this in MDM Data Manager under transformations they are temporary, i.e. they will not be syndicated.
Hope this helps.
Regards,
Ankit
Edited by: ankit jain on Aug 6, 2008 5:06 AM -
Hi everybody,
I need some help regarding value conversion.
For example:
1) a = 1.234.567,89 in SAP format;
we need to convert this to a = 1,234,567.89 in a flat file.
2) The date format shows as DD.MM.YYYY in SAP format;
we need to convert this to MM/DD/YYYY
(no matter which user runs the program).
Please reply.
Hi Sunil,
To display a date converted from system (internal) format to user (external) format, the function module 'CONVERT_DATE_TO_EXTERNAL' is used.
Consider this code.
REPORT zztest_arun_2.

DATA: e_date(10) TYPE c VALUE '2/2/2006',
      i_date(10).

CALL FUNCTION 'CONVERT_DATE_TO_INTERNAL'
  EXPORTING
    date_external            = e_date
  IMPORTING
    date_internal            = i_date
  EXCEPTIONS
    date_external_is_invalid = 1
    OTHERS                   = 2.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
WRITE: / 'CONVERT_DATE_TO_INTERNAL',
       / 'My Date:', e_date,
         'Conv Date:', i_date.
SKIP 2.
CALL FUNCTION 'CONVERT_DATE_TO_EXTERNAL'
  EXPORTING
    date_internal            = sy-datum
  IMPORTING
    date_external            = e_date
  EXCEPTIONS
    date_internal_is_invalid = 1
    OTHERS                   = 2.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
WRITE: / 'CONVERT_DATE_TO_EXTERNAL',
       / 'My Date:', sy-datum,
         'Conv Date:', e_date.
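For the flat-file side of the question, the two transformations are plain string manipulation once the values leave SAP. A minimal sketch in Python (the function names are mine, not SAP's):

```python
def sap_amount_to_us(amount: str) -> str:
    """Convert '1.234.567,89' (period thousands, comma decimals)
    to '1,234,567.89' (comma thousands, period decimals)."""
    # Swap the two separators via a temporary placeholder character.
    return amount.replace('.', '\x00').replace(',', '.').replace('\x00', ',')

def sap_date_to_us(date: str) -> str:
    """Convert 'DD.MM.YYYY' to 'MM/DD/YYYY'."""
    day, month, year = date.split('.')
    return f"{month}/{day}/{year}"

print(sap_amount_to_us('1.234.567,89'))  # 1,234,567.89
print(sap_date_to_us('31.12.2006'))      # 12/31/2006
```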
Thanks,
Reward if helpful. -
Value conversion during syndication.
Hi All,
I need to do a value conversion during syndication of records out of MDM. The values maintained in my MDM repository are true and false, while the destination system can only read Y and N. What would be the best option for having this value conversion done during auto syndication?
Appreciate your responses.
- Aditya
One way to handle this is by using key mapping on a lookup table instead of a boolean field. Often your remote systems will have different internal values than the ones you are storing in your system. For example, you may store "True" or "False" as values in your MDM repository; however, ECC doesn't accept those values and instead wants "X" or NULL. Likewise, a third-party system may expect "T" or "F". The same goes for other fields in your repository (e.g. country codes, region values, etc.). In cases such as this, you should create a lookup table with key mapping enabled. Then you can maintain different values based on the remote system. In other words, you may store the value "TRUE" in your lookup table, but the key mapping for true may maintain "X" for ECC and "T" for a legacy system that you define in the console. Then, when you build your syndication map, you map the remote key to the destination field instead of mapping the actual code value of your repository. This way the correct value always gets syndicated to the partner system, regardless of which system is being syndicated to.
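The per-system key mapping described above boils down to a small lookup keyed by remote system. A minimal sketch of the idea (plain Python, not an MDM API; the system names and values here are illustrative):

```python
# Remote-system-specific representations of the repository's boolean values.
KEY_MAPPING = {
    "TRUE":  {"ECC": "X", "LEGACY": "T", "DEST": "Y"},
    "FALSE": {"ECC": "",  "LEGACY": "F", "DEST": "N"},
}

def syndicated_value(repo_value: str, remote_system: str) -> str:
    """Look up the remote key for a repository value, mirroring what the
    syndication map does when you map the remote key instead of the code."""
    return KEY_MAPPING[repo_value.upper()][remote_system]

print(syndicated_value("true", "DEST"))   # Y
print(syndicated_value("false", "DEST"))  # N
```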
Does that make sense? -
Keytool Error: DER Value Conversion
I'm trying to import a certificate into a Keystore and I keep getting the following error:
Keytool error: java.lang.IllegalArgumentException: DER value conversion
I'm running this on a W2K3 box with Java SDK 1.4.2. I have imported this certificate on two other servers that have the same configuration with no problems. I've been trying to fix this thing for days and it has become very frustrating.
Message was edited by:
JayW
Did you ever get any information on this?
Yes, and I apologize for neglecting to post it here.
First, I didn't get the AIX 1.3 SSL working with my code, but I am pretty sure I could have if I'd just changed the packages I imported. What we have done with our customers using Java 1.3 on AIX is have them keep using the old Sun JSSE 1.0.2 for the time being. And we have our Solaris customers stick to Java 1.3 so we have only one version of the code.
In Java 1.4, SSL is bundled into Java. The SSL classes, which were in vendor-specific packages such as com.ibm.* and com.sun.*, are now in javax.net.ssl.
This information is from Brad Wetmore of the Java Security Team at Sun.
So as soon as we can get Java 1.4 on AIX, we're moving to that and using the javax.net.ssl package. -
Hello,
I am trying to make a URL connection to a secure site with JSSE 1.0.3. I get the 'DER Value conversion' exception printed below when I connect to an IIS server with a Baltimore test server certificate. Does anyone know how to solve this problem?
Thanks, Arianne.
java.lang.IllegalArgumentException: DER Value conversion
java.lang.String sun.security.x509.AVA.toString()
java.lang.String sun.security.x509.RDN.toString()
void sun.security.x509.X500Name.generateDN()
java.lang.String sun.security.x509.X500Name.toString()
java.lang.String sun.security.x509.X500Name.getName()
com.sun.net.ssl.internal.ssl.X500Name com.sun.net.ssl.internal.ssl.X500Name.a(java.security.Principal)
java.security.cert.X509Certificate com.sun.net.ssl.internal.ssl.X509TrustManagerImpl.a(java.security.cert.X509Certificate, java.util.Date)
java.security.cert.X509Certificate[] com.sun.net.ssl.internal.ssl.X509TrustManagerImpl.a(java.security.cert.X509Certificate[], java.util.Date)
boolean com.sun.net.ssl.internal.ssl.X509TrustManagerImpl.a(java.security.cert.X509Certificate[], java.lang.String)
boolean com.sun.net.ssl.internal.ssl.X509TrustManagerImpl.isServerTrusted(java.security.cert.X509Certificate[], java.lang.String)
boolean com.sun.net.ssl.internal.ssl.JsseX509TrustManager.isServerTrusted(java.security.cert.X509Certificate[], java.lang.String)
void com.sun.net.ssl.internal.ssl.ClientHandshaker.a(com.sun.net.ssl.internal.ssl.HandshakeMessage$CertificateMsg)
void com.sun.net.ssl.internal.ssl.ClientHandshaker.processMessage(byte, int)
void com.sun.net.ssl.internal.ssl.Handshaker.process_record(com.sun.net.ssl.internal.ssl.InputRecord)
void com.sun.net.ssl.internal.ssl.SSLSocketImpl.a(com.sun.net.ssl.internal.ssl.InputRecord, boolean)
void com.sun.net.ssl.internal.ssl.SSLSocketImpl.a(com.sun.net.ssl.internal.ssl.OutputRecord)
void com.sun.net.ssl.internal.ssl.AppOutputStream.write(byte[], int, int)
void java.io.OutputStream.write(byte[])
void com.sun.net.ssl.internal.ssl.SSLSocketImpl.startHandshake()
java.net.Socket com.sun.net.ssl.internal.www.protocol.https.HttpsClient.doConnect(java.lang.String, int)
void com.sun.net.ssl.internal.www.protocol.https.NetworkClient.openServer(java.lang.String, int)
void com.sun.net.ssl.internal.www.protocol.https.HttpClient.l()
void com.sun.net.ssl.internal.www.protocol.https.HttpClient.<init>(javax.net.ssl.SSLSocketFactory, java.net.URL, boolean)
void com.sun.net.ssl.internal.www.protocol.https.HttpsClient.<init>(javax.net.ssl.SSLSocketFactory, java.net.URL)
com.sun.net.ssl.internal.www.protocol.https.HttpClient com.sun.net.ssl.internal.www.protocol.https.HttpsClient.a(javax.net.ssl.SSLSocketFactory, java.net.URL, com.sun.net.ssl.HostnameVerifier, boolean)
com.sun.net.ssl.internal.www.protocol.https.HttpClient com.sun.net.ssl.internal.www.protocol.https.HttpsClient.a(javax.net.ssl.SSLSocketFactory, java.net.URL, com.sun.net.ssl.HostnameVerifier)
void com.sun.net.ssl.internal.www.protocol.https.HttpsURLConnection.connect()
java.io.InputStream com.sun.net.ssl.internal.www.protocol.https.HttpsURLConnection.getInputStream()
java.io.InputStream java.net.URL.openStream()
void URLReader.main(java.lang.String[])
Exception in thread main
Process exited with exit code 1.
Sure, this is a solution if you start from zero, but we have the same problem. It worked with iPlanet 4.0 and JRE 1.2.2, but now that we have upgraded to iPlanet 6.0 and 1.3.1, an attribute with no value (like PHONE=;) causes the NSServletRunner to crash. And we have a few hundred certificates with some values missing. How can this be fixed?
-
GL Balances Conversion Question
Hi,
Recently I started working in Oracle Apps, so you can consider me a beginner. I'm working on a GL conversion project. We have balances going back to 2003. It has been decided that we will convert the balances for 2003-2006, and for 2007 and 2008 we will convert in detail (every journal). We also use multiple currencies. My question is about extracting the balances: there are various values for the Translated flag, and something about the BEQ columns.
I would appreciate it if you could tell me which balances I should extract for conversion purposes.
Thanks for your help !!
Shreekar.
Can you give me details of what you have done? I have the same issue and do not know what has to be done.
Thank you! -
How to read no value(conversion file)
Hi friends,
I'm loading actuals data from BW into a BPC application. Some of the records in the BW cube have no value for one characteristic (e.g. master cost center).
1. I want to drop the records that have no value. How can I achieve this through the conversion file? What content should I put in the external column? In the internal column I will use *skip.
2. If I want to assign a new value to records that have no value for one characteristic (e.g. master cost center), how can I do that in the conversion file, i.e. what should I put in the external column?
Both questions relate to the conversion file, for the same characteristic that has no value in the BW cube. For many records I do have a master cost center value, but here I want to handle the no-value case.
Thanks.
You are correct; I want to handle the no-value situation (the cell has no value) for the master cost center InfoObject of the BW cube.
I had a similar idea, as follows:
External col: *
Internal col: *if(js:%external%.length=0 then CC_A, dummy parameters). But I haven't tested it, and I am now analyzing the data in the BW cube.
I would like to know whether there are any other keywords, because in my BW cube around 3 characteristics have no value, similar to the above. Yes, in the above way one could maintain 3 conversion files.
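Stripped of the conversion-file js: syntax, the intended branching is just this (a plain Python sketch; 'CC_A' is the placeholder default value from this thread, and None stands for the *skip case):

```python
from typing import Optional

def map_cost_center(external: str, default: Optional[str] = None) -> Optional[str]:
    """Plain-code sketch of the two conversion-file options for a blank
    master cost center value: return None to drop the record (the *skip
    case), or substitute a default such as 'CC_A' when one is supplied."""
    if external.strip() == "":
        return default  # None => drop the record; 'CC_A' => substitute
    return external     # real values pass through unchanged

print(map_cost_center(""))          # None  -> record dropped
print(map_cost_center("", "CC_A"))  # CC_A  -> default assigned
print(map_cost_center("1000"))      # 1000
```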
Regards,
Naresh -
Foreign Currency Valuation Values Conversion
Hello SAP Experts!
We are migrating from 4.6B to 4.7 and we are working in the vendor, customer and G/L accounts open items conversion.
In the present system (4.6B), the users use transaction F.05 for foreign currency valuation with the flag "Bal. sheet preparation valuatn" activated. This means that the valuation difference is not reversed but is stored in the field BSEG-BDIFF of the affected open item.
Now we are trying to convert those foreign currency open items with a batch input to transaction FB01. However, the field BSEG-BDIFF does not appear in the dynpros, and we could not find a way to make it optional in order to enter the value of previous revaluations.
We have thought of transaction F-05, but there is no way there to reference the revaluation being posted to the affected open item.
If we do not enter this amount in that field, we will have problems after go-live when paying those items, as the "Bal. sheet adj. 1" account balance will never be zero and the gain/loss accounts for exchange rate differences will be duplicated.
Do anyone know how can this be done?
Hope you can help me on this one.
Many thanks in advance.
Regards!
Noelia
Hi Dominic!!
Thanks a lot for your answer. Yes, they are separate systems.
Let me see if I understand your suggestion:
1) I should make a manual posting through F-05 in our 4.6B system, bringing the balance adjustment account and the exchange rate difference account to zero.
2) Transfer the balances to 4.7.
3) In the first closing period, run the automatic foreign currency valuation (through F.05) with the flag "Bal. sheet preparation valuatn" deactivated, so that the system revaluates the open items from the time each open item was created up to that moment.
Is that right?
Many thanks for your help again!
Best regards,
Noelia -
ABAP Routine Conversion question
This is a routine that was migrated from 3.5. Here is the code I have; I've bolded the area that's throwing the error message:
PROGRAM trans_routine.
CLASS routine DEFINITION
CLASS lcl_transform DEFINITION.
PUBLIC SECTION.
Attributs
DATA:
p_check_master_data_exist
TYPE RSODSOCHECKONLY READ-ONLY,
*- Instance for getting request runtime attributs;
Available information: Refer to methods of
interface 'if_rsbk_request_admintab_view'
p_r_request
TYPE REF TO if_rsbk_request_admintab_view READ-ONLY.
PRIVATE SECTION.
TYPE-POOLS: rsd, rstr.
Rule specific types
TYPES:
BEGIN OF tys_SC_1,
Field: SHKZG Debit/credit.
SHKZG TYPE C LENGTH 1,
Field: ZZ_QUAN Quantity.
ZZ_QUAN TYPE P LENGTH 7 DECIMALS 3,
END OF tys_SC_1.
TYPES:
BEGIN OF tys_TG_1,
InfoObject: 0COPASLQTY Sales quantity.
COPASLQTY TYPE /BI0/OICOPASLQTY,
END OF tys_TG_1.
$$ begin of global - insert your declaration only below this line -
... "insert your code here
$$ end of global - insert your declaration only before this line -
METHODS
compute_0COPASLQTY
IMPORTING
request type rsrequest
datapackid type rsdatapid
SOURCE_FIELDS type tys_SC_1
EXPORTING
RESULT type tys_TG_1-COPASLQTY
monitor type rstr_ty_t_monitor
RAISING
cx_rsrout_abort
cx_rsrout_skip_record
cx_rsrout_skip_val.
METHODS
invert_0COPASLQTY
IMPORTING
i_th_fields_outbound TYPE rstran_t_field_inv
i_r_selset_outbound TYPE REF TO cl_rsmds_set
i_is_main_selection TYPE rs_bool
i_r_selset_outbound_complete TYPE REF TO cl_rsmds_set
i_r_universe_inbound TYPE REF TO cl_rsmds_universe
CHANGING
c_th_fields_inbound TYPE rstran_t_field_inv
c_r_selset_inbound TYPE REF TO cl_rsmds_set
c_exact TYPE rs_bool.
ENDCLASS. "routine DEFINITION
$$ begin of 2nd part global - insert your code only below this line *
$$ end of rule type
TYPES:
BEGIN OF tys_TG_1_full,
InfoObject: 0COMP_CODE Company code.
COMP_CODE TYPE /BI0/OICOMP_CODE,
InfoObject: 0FISCPER Fiscal year / period.
FISCPER TYPE /BI0/OIFISCPER,
InfoObject: 0AC_DOC_NO Accounting document number.
AC_DOC_NO TYPE /BI0/OIAC_DOC_NO,
InfoObject: 0ITEM_NUM Number of line item within accounting docum
*ent.
ITEM_NUM TYPE /BI0/OIITEM_NUM,
InfoObject: 0FI_DOCSTAT Item Status.
FI_DOCSTAT TYPE /BI0/OIFI_DOCSTAT,
InfoObject: 0CHRT_ACCTS Chart of accounts.
CHRT_ACCTS TYPE /BI0/OICHRT_ACCTS,
InfoObject: 0GL_ACCOUNT G/L Account.
GL_ACCOUNT TYPE /BI0/OIGL_ACCOUNT,
InfoObject: 0ACCT_TYPE Account type.
ACCT_TYPE TYPE /BI0/OIACCT_TYPE,
InfoObject: 0SP_GL_IND Special G/L indicator.
SP_GL_IND TYPE /BI0/OISP_GL_IND,
InfoObject: 0AC_DOC_TYP Document type.
AC_DOC_TYP TYPE /BI0/OIAC_DOC_TYP,
InfoObject: 0POST_KEY Posting key.
POST_KEY TYPE /BI0/OIPOST_KEY,
InfoObject: 0FISCVARNT Fiscal year variant.
FISCVARNT TYPE /BI0/OIFISCVARNT,
InfoObject: 0DOC_DATE Document Date.
DOC_DATE TYPE /BI0/OIDOC_DATE,
InfoObject: 0PSTNG_DATE Posting date in the document.
PSTNG_DATE TYPE /BI0/OIPSTNG_DATE,
InfoObject: 0CREATEDON Date on which the record was created.
CREATEDON TYPE /BI0/OICREATEDON,
InfoObject: 0VALUE_DATE Value Date.
VALUE_DATE TYPE /BI0/OIVALUE_DATE,
InfoObject: 0CLEAR_DATE Clearing date.
CLEAR_DATE TYPE /BI0/OICLEAR_DATE,
InfoObject: 0CLR_DOC_NO Clearing Document Number.
CLR_DOC_NO TYPE /BI0/OICLR_DOC_NO,
InfoObject: 0CO_AREA Controlling area.
CO_AREA TYPE /BI0/OICO_AREA,
InfoObject: 0COSTCENTER Cost Center.
COSTCENTER TYPE /BI0/OICOSTCENTER,
InfoObject: 0PROFIT_CTR Profit Center.
PROFIT_CTR TYPE /BI0/OIPROFIT_CTR,
InfoObject: 0COORDER Order number.
COORDER TYPE /BI0/OICOORDER,
InfoObject: 0WBS_ELEMT Work Breakdown Structure Element (WBS Elem
*ent).
WBS_ELEMT TYPE /BI0/OIWBS_ELEMT,
InfoObject: 0PLANT Plant.
PLANT TYPE /BI0/OIPLANT,
InfoObject: 0BUS_AREA Business area.
BUS_AREA TYPE /BI0/OIBUS_AREA,
InfoObject: 0FUNC_AREA Functional area.
FUNC_AREA TYPE /BI0/OIFUNC_AREA,
InfoObject: 0PART_PRCTR Partner profit center.
PART_PRCTR TYPE /BI0/OIPART_PRCTR,
InfoObject: 0PCOMPANY Partner company number.
PCOMPANY TYPE /BI0/OIPCOMPANY,
InfoObject: 0PBUS_AREA Trading partner business area of the busin
*ess partner.
PBUS_AREA TYPE /BI0/OIPBUS_AREA,
InfoObject: 0LOC_CURRCY Local currency.
LOC_CURRCY TYPE /BI0/OILOC_CURRCY,
InfoObject: 0DEBIT_LC Debit amount in local currency.
DEBIT_LC TYPE /BI0/OIDEBIT_LC,
InfoObject: 0CREDIT_LC Credit amount in local currency.
CREDIT_LC TYPE /BI0/OICREDIT_LC,
InfoObject: 0DEB_CRE_LC Amount in Local Currency with +/- Signs.
DEB_CRE_LC TYPE /BI0/OIDEB_CRE_LC,
InfoObject: 0DOC_CURRCY Document currency.
DOC_CURRCY TYPE /BI0/OIDOC_CURRCY,
InfoObject: 0DEBIT_DC Debit amount in foreign currency.
DEBIT_DC TYPE /BI0/OIDEBIT_DC,
InfoObject: 0CREDIT_DC Credit amount in foreign currency.
CREDIT_DC TYPE /BI0/OICREDIT_DC,
InfoObject: 0DEB_CRE_DC Foreign currency amount with signs (+/-).
DEB_CRE_DC TYPE /BI0/OIDEB_CRE_DC,
InfoObject: 0LOC_CURTP2 Currency Type of Second Local Currency.
LOC_CURTP2 TYPE /BI0/OILOC_CURTP2,
InfoObject: 0LOC_CURRC2 Second Local Currency.
LOC_CURRC2 TYPE /BI0/OILOC_CURRC2,
InfoObject: 0DEBIT_LC2 Debit Amount in 2nd Local Currency.
DEBIT_LC2 TYPE /BI0/OIDEBIT_LC2,
InfoObject: 0CREDIT_LC2 Credit Amount in Second Local Currency.
CREDIT_LC2 TYPE /BI0/OICREDIT_LC2,
InfoObject: 0DEB_CRE_L2 Amount in Second Local Currency with +/-
*Sign.
DEB_CRE_L2 TYPE /BI0/OIDEB_CRE_L2,
InfoObject: 0LOC_CURTP3 Currency Type of Third Local Currency.
LOC_CURTP3 TYPE /BI0/OILOC_CURTP3,
InfoObject: 0LOC_CURRC3 Third Local Currency.
LOC_CURRC3 TYPE /BI0/OILOC_CURRC3,
InfoObject: 0DEBIT_LC3 Debit Amount in Third Local Currency.
DEBIT_LC3 TYPE /BI0/OIDEBIT_LC3,
InfoObject: 0CREDIT_LC3 Credit Amount in Third Local Currency.
CREDIT_LC3 TYPE /BI0/OICREDIT_LC3,
InfoObject: 0DEB_CRE_L3 Amount in Third Local Currency with +/- S
*ign.
DEB_CRE_L3 TYPE /BI0/OIDEB_CRE_L3,
InfoObject: 0REF_DOC_NO Reference document number.
REF_DOC_NO TYPE /BI0/OIREF_DOC_NO,
InfoObject: 0REF_KEY3 Reference Key 3.
REF_KEY3 TYPE /BI0/OIREF_KEY3,
InfoObject: 0ORG_DOC_NO Document Number of Source Document.
ORG_DOC_NO TYPE /BI0/OIORG_DOC_NO,
InfoObject: 0ORG_DOC_YR Fiscal Year for Source Document Number.
ORG_DOC_YR TYPE /BI0/OIORG_DOC_YR,
InfoObject: 0ORG_DOC_CC Company Code for Source Document Number.
ORG_DOC_CC TYPE /BI0/OIORG_DOC_CC,
InfoObject: 0ORG_DOC_CO Controlling Area for Source Document Numb
*er.
ORG_DOC_CO TYPE /BI0/OIORG_DOC_CO,
InfoObject: 0POSTXT Item Text.
POSTXT TYPE /BI0/OIPOSTXT,
InfoObject: 0RECORDMODE BW Delta Process: Record Mode.
RECORDMODE TYPE RODMUPDMOD,
InfoObject: 0ALLOC_NMBR Allocation Number.
ALLOC_NMBR TYPE /BI0/OIALLOC_NMBR,
InfoObject: 0COPASLQTY Sales quantity.
COPASLQTY TYPE /BI0/OICOPASLQTY,
InfoObject: 0COPASLQTU Sales unit.
COPASLQTU TYPE /BI0/OICOPASLQTU,
InfoObject: ZTRANSTYP Transaction Type.
/BIC/ZTRANSTYP TYPE /BIC/OIZTRANSTYP,
InfoObject: 0FI_DBCRIND Debit/Credit Indicator.
FI_DBCRIND TYPE /BI0/OIFI_DBCRIND,
InfoObject: 0ASSET_MAIN Main Asset Number.
ASSET_MAIN TYPE /BI0/OIASSET_MAIN,
InfoObject: 0JV_RECIND Joint Venture Recovery Indicator.
JV_RECIND TYPE /BI0/OIJV_RECIND,
InfoObject: ZTBSETPER Settlement Period.
/BIC/ZTBSETPER TYPE /BIC/OIZTBSETPER,
Field: RECORD Data record number.
RECORD TYPE RSARECORD,
END OF tys_TG_1_full.
TYPES:
BEGIN OF tys_SC_1__RULE_63,
Field: SHKZG Debit/credit.
SHKZG TYPE C LENGTH 1,
Field: ZZ_QUAN Quantity.
ZZ_QUAN TYPE P LENGTH 7 DECIMALS 3,
END OF tys_SC_1__RULE_63.
Additional declaration for transfer rule interface
DATA:
g_t_errorlog TYPE rssm_t_errorlog_int,
RECORD_ALL LIKE SY-TABIX.
global definitions from transfer rules
TABLES: ...
DATA: ...
FORM compute_COPASLQTY
USING
RECORD_NO type sy-tabix
TRAN_STRUCTURE type tys_SC_1__RULE_63
CHANGING
RESULT TYPE tys_TG_1_full-COPASLQTY
RETURNCODE LIKE sy-subrc
ABORT LIKE sy-subrc
RAISING
cx_sy_arithmetic_error
cx_sy_conversion_error.
DATA: l_s_errorlog TYPE rssm_s_errorlog_int.
If TRAN_STRUCTURE-SHKZG = 'S'.
RESULT = TRAN_STRUCTURE-ZZ_QUAN * -1.
ELSE.
RESULT = TRAN_STRUCTURE-ZZ_QUAN.
ENDIF.
returncode <> 0 means skip this record
RETURNCODE = 0.
abort <> 0 means skip whole data package !!!
ABORT = 0.
ENDFORM. "COPASLQTY
$$ end of 2nd part global - insert your code only before this line *
CLASS routine IMPLEMENTATION
CLASS lcl_transform IMPLEMENTATION.
METHOD compute_0COPASLQTY.
IMPORTING
request type rsrequest
datapackid type rsdatapid
SOURCE_FIELDS-SHKZG TYPE C LENGTH 000001
SOURCE_FIELDS-ZZ_QUAN TYPE P LENGTH 000007 DECIMALS 000003
EXPORTING
RESULT type tys_TG_1-COPASLQTY
DATA:
MONITOR_REC TYPE rsmonitor.
*$*$ begin of routine - insert your code only below this line *-*
Data:
l_s_error_log type rssm_s_errorlog_int,
TRAN_STRUCTURE type tys_SC_1__RULE_63,
l_subrc type sy-tabix,
l_abort type sy-tabix,
ls_monitor TYPE rsmonitor,
ls_monitor_recno TYPE rsmonitors.
REFRESH:
monitor,
monitor_recno.
Migrated transfer rule call
MOVE-CORRESPONDING SOURCE_FIELDS to TRAN_STRUCTURE.
PERFORM compute_COPASLQTY
  USING
    SOURCE_FIELDS-record
    TRAN_STRUCTURE
  CHANGING
    RESULT
    l_subrc
    l_abort.
*-- Convert Messages in Transformation format
LOOP AT G_T_ERRORLOG INTO l_s_error_log.
move-CORRESPONDING l_s_error_log to MONITOR_REC.
append monitor_rec to MONITOR.
ENDLOOP.
IF l_subrc <> 0.
RAISE EXCEPTION TYPE cx_rsrout_skip_val.
ENDIF.
IF l_abort <> 0.
RAISE EXCEPTION TYPE CX_RSROUT_ABORT.
ENDIF.
$$ end of routine - insert your code only before this line -
ENDMETHOD. "compute_0COPASLQTY
Method invert_0COPASLQTY
This subroutine needs to be implemented only for direct access
(for better performance) and for the Report/Report Interface
(drill through).
The inverse routine should transform a projection and
a selection for the target to a projection and a selection
for the source, respectively.
If the implementation remains empty all fields are filled and
all values are selected.
METHOD invert_0COPASLQTY.
$$ begin of inverse routine - insert your code only below this line-
... "insert your code here
$$ end of inverse routine - insert your code only before this line -
ENDMETHOD. "invert_0COPASLQTY
ENDCLASS. "routine IMPLEMENTATION

Here was the original code. I've migrated the DataSource now and have basically started over using the 7.0 flow. We have a full 7.0 implementation, but the consultants chose not to migrate the Business Content and instead built out the dataflow using 3.5 functionality. This routine is the only issue I'm having: it is flipping the sign. The code above is what BW produced when I created a transformation from the original rule.
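For what it's worth, since the transfer-rule logic is only a sign flip, it can also be implemented directly in the 7.0 field routine rather than through the migrated FORM. A minimal sketch, assuming SOURCE_FIELDS carries SHKZG and ZZ_QUAN as in the generated listing above (verify the names against your transformation):

```abap
METHOD compute_0COPASLQTY.
* Plain 7.0 field routine sketch: negate the quantity for
* debit postings (SHKZG = 'S'), otherwise pass it through.
  IF source_fields-shkzg = 'S'.
    result = source_fields-zz_quan * -1.
  ELSE.
    result = source_fields-zz_quan.
  ENDIF.
ENDMETHOD.
```

One thing worth checking if the sign still comes out wrong: whether the target also applies its own debit/credit or 0RECORDMODE handling on top of this routine, so the flip isn't being applied twice.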
PROGRAM CONVERSION_ROUTINE.
Type pools used by conversion program
TYPE-POOLS: RS, RSARC, RSARR, SBIWA, RSSM.
Declaration of transfer structure (selected fields only)
TYPES: BEGIN OF TRANSFER_STRUCTURE ,
InfoObject 0COMP_CODE: CHAR - 000004
BUKRS(000004) TYPE C,
InfoObject 0FISCPER: NUMC - 000007
FISCPER(000007) TYPE N,
InfoObject 0AC_DOC_NO: CHAR - 000010
BELNR(000010) TYPE C,
InfoObject 0ITEM_NUM: NUMC - 000003
BUZEI(000003) TYPE N,
InfoObject 0FI_DOCSTAT: CHAR - 000001
STATUSPS(000001) TYPE C,
InfoObject 0CHRT_ACCTS: CHAR - 000004
KTOPL(000004) TYPE C,
InfoObject 0GL_ACCOUNT: CHAR - 000010
HKONT(000010) TYPE C,
InfoObject 0ACCT_TYPE: CHAR - 000001
KOART(000001) TYPE C,
InfoObject 0SP_GL_IND: CHAR - 000001
UMSKZ(000001) TYPE C,
InfoObject 0AC_DOC_TYP: CHAR - 000002
BLART(000002) TYPE C,
InfoObject 0POST_KEY: CHAR - 000002
BSCHL(000002) TYPE C,
InfoObject 0FISCVARNT: CHAR - 000002
FISCVAR(000002) TYPE C,
InfoObject 0DOC_DATE: DATS - 000008
BLDAT(000008) TYPE D,
InfoObject 0PSTNG_DATE: DATS - 000008
BUDAT(000008) TYPE D,
InfoObject 0CREATEDON: DATS - 000008
CPUDT(000008) TYPE D,
InfoObject 0VALUE_DATE: DATS - 000008
VALUT(000008) TYPE D,
InfoObject 0CLEAR_DATE: DATS - 000008
AUGDT(000008) TYPE D,
InfoObject 0CLR_DOC_NO: CHAR - 000010
AUGBL(000010) TYPE C,
InfoObject 0CO_AREA: CHAR - 000004
KOKRS(000004) TYPE C,
InfoObject 0COSTCENTER: CHAR - 000010
KOSTL(000010) TYPE C,
InfoObject 0PROFIT_CTR: CHAR - 000010
PRCTR(000010) TYPE C,
InfoObject 0COORDER: CHAR - 000012
AUFNR(000012) TYPE C,
InfoObject : NUMC - 000008
PROJK(000008) TYPE N,
InfoObject 0PLANT: CHAR - 000004
WERKS(000004) TYPE C,
InfoObject 0BUS_AREA: CHAR - 000004
GSBER(000004) TYPE C,
InfoObject 0FUNC_AREA: CHAR - 000004
FKBER(000004) TYPE C,
InfoObject 0PART_PRCTR: CHAR - 000010
PPRCT(000010) TYPE C,
InfoObject 0PBUS_AREA: CHAR - 000004
PARGB(000004) TYPE C,
InfoObject 0PCOMPANY: CHAR - 000006
VBUND(000006) TYPE C,
InfoObject 0LOC_CURRCY: CUKY - 000005
LCURR(000005) TYPE C,
InfoObject 0DEBIT_LC: CURR - 000013
DMSOL(000007) TYPE P,
InfoObject 0CREDIT_LC: CURR - 000013
DMHAB(000007) TYPE P,
InfoObject 0DEB_CRE_LC: CURR - 000013
DMSHB(000007) TYPE P,
InfoObject 0DOC_CURRCY: CUKY - 000005
WAERS(000005) TYPE C,
InfoObject 0DEBIT_DC: CURR - 000013
WRSOL(000007) TYPE P,
InfoObject 0CREDIT_DC: CURR - 000013
WRHAB(000007) TYPE P,
InfoObject 0DEB_CRE_DC: CURR - 000013
WRSHB(000007) TYPE P,
InfoObject 0LOC_CURTP2: CHAR - 000002
CURT2(000002) TYPE C,
InfoObject 0LOC_CURRC2: CUKY - 000005
HWAE2(000005) TYPE C,
InfoObject 0DEBIT_LC2: CURR - 000013
DMSO2(000007) TYPE P,
InfoObject 0CREDIT_LC2: CURR - 000013
DMHA2(000007) TYPE P,
InfoObject 0DEB_CRE_L2: CURR - 000013
DMSH2(000007) TYPE P,
InfoObject 0LOC_CURTP3: CHAR - 000002
CURT3(000002) TYPE C,
InfoObject 0LOC_CURRC3: CUKY - 000005
HWAE3(000005) TYPE C,
InfoObject 0DEBIT_LC3: CURR - 000013
DMSO3(000007) TYPE P,
InfoObject 0CREDIT_LC3: CURR - 000013
DMHA3(000007) TYPE P,
InfoObject 0DEB_CRE_L3: CURR - 000013
DMSH3(000007) TYPE P,
InfoObject 0REF_DOC_NO: CHAR - 000016
XBLNR(000016) TYPE C,
InfoObject 0REF_KEY3: CHAR - 000020
XREF3(000020) TYPE C,
InfoObject 0ORG_DOC_NO: CHAR - 000010
AWREF(000010) TYPE C,
InfoObject 0ORG_DOC_YR: NUMC - 000004
AWGJA(000004) TYPE N,
InfoObject 0ORG_DOC_CC: CHAR - 000004
AWBUK(000004) TYPE C,
InfoObject 0ORG_DOC_CO: CHAR - 000004
AWKOK(000004) TYPE C,
InfoObject 0POSTXT: CHAR - 000050
SGTXT(000050) TYPE C,
InfoObject 0RECORDMODE: CHAR - 000001
UPDMOD(000001) TYPE C,
InfoObject 0WBS_ELEMT: CHAR - 000024
PS_POSID(000024) TYPE C,
InfoObject 0ALLOC_NMBR: CHAR - 000018
ZUONR(000018) TYPE C,
InfoObject 0COPASLQTU: UNIT - 000003
ZZ_MEINS(000003) TYPE C,
InfoObject 0COPASLQTY: QUAN - 000013
ZZ_QUAN(000007) TYPE P
DECIMALS 000003,
InfoObject ZTRANSTYP: CHAR - 000003
BEWAR(000003) TYPE C,
InfoObject 0FI_DBCRIND: CHAR - 000001
SHKZG(000001) TYPE C,
END OF TRANSFER_STRUCTURE .
Global code used by conversion rules
$$ begin of global - insert your declaration only below this line -
TABLES: ...
DATA: ...
$$ end of global - insert your declaration only before this line -
FORM COMPUTE_COPASLQTY
Compute value of InfoObject 0COPASLQTY
in communication structure /BIC/CS0FI_GL_4
Technical properties:
field name = COPASLQTY
data element = /BI0/OICOPASLQTY
data type = QUAN
length = 000017
decimals = 000003
ABAP type = P
ABAP length = 000009
reference field = 0COPASLQTU
Parameters:
--> RECORD_NO Record number
--> TRAN_STRUCTURE Transfer structure
<-- RESULT Return value of InfoObject
<-> G_T_ERRORLOG Error log
<-- RETURNCODE Return code (to skip one record)
<-- ABORT Abort code (to skip whole data package)
FORM COMPUTE_COPASLQTY
USING RECORD_NO LIKE SY-TABIX
TRAN_STRUCTURE TYPE TRANSFER_STRUCTURE
G_S_MINFO TYPE RSSM_S_MINFO
CHANGING RESULT TYPE /BI0/OICOPASLQTY
G_T_ERRORLOG TYPE rssm_t_errorlog_int
RETURNCODE LIKE SY-SUBRC
ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel datapackage
$$ begin of routine - insert your code only below this line -
DATA: l_s_errorlog TYPE rssm_s_errorlog_int.
If TRAN_STRUCTURE-SHKZG = 'S'.
RESULT = TRAN_STRUCTURE-ZZ_QUAN * -1.
ELSE.
RESULT = TRAN_STRUCTURE-ZZ_QUAN.
ENDIF.
returncode <> 0 means skip this record
RETURNCODE = 0.
abort <> 0 means skip whole data package !!!
ABORT = 0.
$$ end of routine - insert your code only before this line -
ENDFORM.
-
Content conversion question for JMS adapter
Hi,
I need to post this again here.
I have the scenario R/3 IDoc -> XI -> MQ (WebSphere). MQ requires plain text.
I have the IDoc ORDERS05 with a multi-level (nested) structure, but using the how-to guide for content conversion I could only get one level deep.
Based on the thread
Process Integration (PI) & SOA Middleware
It is not possible to use the conversion module under the XML2Plain key with the parameter xml.conversionType set to StructXML2Plain.
Please confirm whether it is possible to convert a structure like the one below
<ZSYSEX01>
.<IDOC BEGIN="1">
....<EDI_DC40 SEGMENT="SEGMENT0">
........<FIELD1>HEADER</FIELD1>
....</EDI_DC40>
....<E1STATS SEGMENT="SEGMENT1">
........<FIELD2>100</FIELD2>
........<Z1HDSTAT SEGMENT="SEGMENT2">
...........<FIELD3>0200000716</FIELD3>
...........<Z1ITSTAT SEGMENT="SEGMENT3">
...............<FIELD4>1000</FIELD4>
...........</Z1ITSTAT>
........</Z1HDSTAT>
........<Z1HDSTAT SEGMENT="SEGMENT2">
...........<FIELD3>0200000717</FIELD3>
...........<Z1ITSTAT SEGMENT="SEGMENT3">
...............<FIELD4>1000</FIELD4>
...........</Z1ITSTAT>
...........<Z1ITSTAT SEGMENT="SEGMENT3">
...............<FIELD4>1001</FIELD4>
...........</Z1ITSTAT>
...........<Z1ITSTAT SEGMENT="SEGMENT3">
...............<FIELD4>1002</FIELD4>
...........</Z1ITSTAT>
........</Z1HDSTAT>
....</E1STATS>
.</IDOC>
</ZSYSEX01>
to plain text in JMS adapter.
Thanks!
Jason

Hi,
You have to use Content Conversion on the JMS adapter receiver side.
Module Sequence in the Receiver Channel
No. Module Name Module Key
1 localejbs/AF_Modules/MessageTransformBean XML2Plain
2 localejbs/SAP XI JMS Adapter/ConvertMessageToBinary CallJMSService
3 localejbs/SAP XI JMS Adapter/SendBinarytoXIJMSService Exit
Based on the structure, you have to configure the processing parameters.
Please see the link below; it should help you further:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f02d12a7-0201-0010-5780-8bfc7d12f891
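As an illustration only (parameter names should be verified against the MessageTransformBean documentation for your PI release), a flat, single-level conversion under the XML2Plain module key is typically configured with parameters along these lines; as noted above, there is no StructXML2Plain conversion type for nested structures:

```
Transform.Class       = com.sap.aii.messaging.adapter.Conversion
Transform.ContentType = text/plain
xml.conversionType    = SimpleXML2Plain
xml.addHeaderLine     = 0
xml.fieldSeparator    = ;
```

For deeply nested IDocs like ORDERS05, the usual workaround is to flatten the structure first (e.g. in message mapping, or with an XSLT or Java mapping) so that the payload handed to the conversion module is single-level.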
Regards
Chilla.. -
Refinement Panel shows "file name" values with question marks instead of spaces
Hi,
I customized the Refinement Panel to refine by file name. In some cases the value (file name) is shown with question marks instead of spaces. It looks like gibberish, and refining by that value doesn't return any results.
Any ideas how to solve?
keren tsur

Hi Keren,
Please try to reset the index in Central Administration > Application Management > Manage service applications > click the Search service application > Crawling > Index Reset > check the box "Deactivate search alerts during reset" > Reset Now > OK.
Then start a full crawl under Central Administration > Application Management > Manage service applications > click the Search service application > Crawling > Content Sources.
In addition, please capture a screenshot of the issue.
Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
[email protected]
Regards,
Rebecca Tu
TechNet Community Support -
Magic Values - A Question of Development Approach
Hello folks,
I have a question for you PL/SQL developers out there. This isn't a specific problem or query I'm raising here; it's more a question of general approach. I'm probably not using the correct terms, so forgive me. Also, I've already posted this in the ApEx forum; however, there's a degree of overlap into pure PL/SQL, so I thought you were all bound to have experience of something along the same lines.
Anyhoo...
How do you deal with magic values - i.e. values which hold no intrinsic meaning in and of themselves, other than for state, process or conditional-logic control? I use them quite a lot in my PL/SQL code (as I'm sure most developers do in one context or another).
From a Data architecture perspective, I'll generally have some sort of table for storing the 'facts': names, addresses etc, etc. Any application-specific magic values ('status', 'type') will be held as a foreign key in this table, which will reference a form of lookup table.
Example:
EMOTION
ID Description
== ===========
1 HAPPY
2 SAD
3 NEUTRAL
PERSON
NAME ... EMOTIONAL_STATE
==== ===============
BILL 1
JERRY 1
BRIAN 3
DONNA 2

So far, so banal...
Now, say I have a process that needs to reference someone's emotional state for some sort of conditional logic:
declare
  n_estate number;
begin
  select emotional_state into n_estate
  from Person
  where name = 'BILL';

  case
    when n_estate = 1 then
      -- do something
      null;
    when n_estate = 2 then
      -- do something else
      null;
    else
      -- otherwise something else again
      null;
  end case;
end;

Straight away your bad-code radar should be going crazy: you're hard-coding literals in there! So, the old Java programmer in me wants to store these as constants - I'll generally square them away inside a package somewhere, like so:
create or replace package PKG_CONSTANTS as
ES_HAPPY constant number:= 1;
ES_SAD constant number := 2;
ES_NEUTRAL constant number := 3;
end PKG_CONSTANTS;

Thus the code becomes
case when n_estate = PKG_CONSTANTS.ES_HAPPY then ...

Herein lies the crux of the issue. I'm effectively defining the same value twice: once in the lookup table (for data integrity) and once in the package. If new values are defined (say "Existential Ennui") or existing values are changed, I need to make sure the two stay aligned, which hinders maintainability.
I thought about initialising the values as sort of pseudo-constants in the package initialise code but then you end up replacing one literal with another; you end up with code like:
create or replace package PKG_CONSTANTS as
  ES_HAPPY   number;
  ES_SAD     number;
  ES_NEUTRAL number;
end PKG_CONSTANTS;

create or replace package body PKG_CONSTANTS as
begin
  -- initialization section: load the pseudo-constants from the lookup table
  for rec in (
    select ID, description
    from EMOTIONAL_STATE
  )
  loop
    case rec.description
      when 'HAPPY' then
        ES_HAPPY := rec.ID;
      when 'SAD' then
        ES_SAD := rec.ID;
      when 'NEUTRAL' then
        ES_NEUTRAL := rec.ID;
      else
        null;
    end case;
  end loop;
end PKG_CONSTANTS;

I also thought about using dynamic PL/SQL to rewrite and recompile the constants package whenever a value changes in the lookup table... which seems like quite a lot of work, given that the magic value is pretty much meaningless outside the scope of the application.
So... how do you deal with this? What approach do you take? Does data integrity override application programming style?
Any contributions would be welcome!

Hello,
I had a look through the article (8 years' worth of thread? Sheesh, that's dedication!) and yet it doesn't quite express exactly what I mean. The argument there appears to be between dynamic SQL with bind variables versus static SQL. I'm not talking about dynamically building queries or the use of bind variables per se - it's more about how one makes use of magic values within the context of conditional logic and application code.
The example I chose happened to use a CASE statement, which perhaps blurs the line with pure SQL syntax and may be why you thought I was going down the dynamic SQL route, but I could just as easily have replaced it with a series of 'if/elsif/else/end' expressions.
From an application developer's point of view, the mantra of 'abstraction through constants' is the norm - referencing literals in expressions is generally frowned upon, with the possible exception of special numbers such as 1 or 0 (for incrementing counters, referring to the start of arrays, etc.). One only has to look at the work of Feuerstein to see this - time and again in his books, the concept of delegating constant values (and subtypes) to well-defined areas (the "Single Source of Truth") rears its head.
Now in the Oracle world, data architecture generally has primacy, which in this case manifests itself as the use of foreign keys in data tables referencing the equivalent lookups (dimensional modelling, star schemas and the rest) - thus even special, application-specific values, i.e. with no intrinsic meaning in the real world, end up in your ERD. There appears to be a bit of a difference of opinion, depending on the background of the developer.
Hence my question - how do you, as developers, deal with these sorts of situations?
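Not an answer so much as one common compromise, sketched against the EMOTIONAL_STATE example above: keep the lookup table as the single source of truth and expose the IDs through a caching accessor function, so application code references the description while the numeric ID lives only in the data. (The package and variable names here are made up for illustration.)

```sql
create or replace package pkg_emotion as
  -- Look up the ID for a description, caching the result
  -- so the table is read at most once per value per session.
  function id_of (p_description in varchar2) return number;
end pkg_emotion;
/
create or replace package body pkg_emotion as
  type t_ids is table of number index by varchar2(30);
  g_ids t_ids;  -- session-level cache: description -> ID

  function id_of (p_description in varchar2) return number is
  begin
    if not g_ids.exists(p_description) then
      select id
        into g_ids(p_description)
        from emotional_state
       where description = p_description;
    end if;
    return g_ids(p_description);
  end id_of;
end pkg_emotion;
/
```

Conditions then read as `when n_estate = pkg_emotion.id_of('HAPPY')`. The trade-off is that the literal moves from an opaque number to the description itself, which is at least meaningful and is validated against the table at runtime (an unknown description raises NO_DATA_FOUND rather than silently matching nothing).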