How to fill a dependent Z-table, created with the Rapid Application Tool, with initial data?
Hello gurus,
I created a component ZBOOKING with the RAD tool and want to fill the booking detail table with initial data during creation of a new booking object.
My booking component looks like the picture below. It was created with the Rapid Application Tool and is based on one ZBOOKING table, which was created in the CRM backend, and one "Booking details" table, which was created with the Rapid Application Tool.
My question is: how should I populate initial template data into this dependent "Booking details" table when I create a new booking object?
Web UI of ZBooking component
Component structure of ZBooking
I have tried a sample implementation in the DO_INIT_CONTEXT method of the ZACL_CLASS00000C_IMPL class to fill ZBOOKING_DETAILS with some data.
method DO_INIT_CONTEXT.
  DATA: lr_col       TYPE REF TO if_bol_bo_col,
        lr_valuenode TYPE REF TO cl_bsp_wd_value_node,
        lr_template  TYPE REF TO zbooking_detail.

  CALL METHOD SUPER->DO_INIT_CONTEXT.

  CREATE DATA lr_template.
  CREATE OBJECT lr_valuenode
    EXPORTING
      iv_data_ref = lr_template.

  IF lr_valuenode IS BOUND.
    lr_valuenode->SET_PROPERTY_AS_STRING(
      iv_attr_name = 'ZZVISITDATE'           "#EC NOTEXT
      iv_value     = '01.01.2014' ).
  ENDIF.

  CREATE OBJECT lr_col TYPE cl_crm_bol_bo_col.
  lr_col->ADD(
    EXPORTING
      iv_entity    = lr_valuenode
      iv_set_focus = abap_true ).
  me->TYPED_CONTEXT->ZBOOKING_DETAIL->SET_COLLECTION( lr_col ).
endmethod.
After this implementation the initial data is displayed in "Booking details".
However, if I press the Edit List button (assignment block) I get a "CUST Operation" error.
Maybe I should fill this dependent table ZBOOKING_DETAILS without cl_bsp_wd_value_node?
Where should I implement the logic if I want to fill initial data only during creation of the booking object, and not every time the booking object is called or initialized?
Regards Dmitry
Hi,
I get a "Dereferencing of the NULL reference" error during create_related_entity.
As you can see in the picture below, I think I am using the right relation name for my zbooking_detail table.
method DO_INIT_CONTEXT.
  DATA: lr_leading_entity TYPE REF TO cl_crm_bol_entity,
        lr_col            TYPE REF TO if_bol_bo_col.

  CALL METHOD SUPER->DO_INIT_CONTEXT.

  lr_leading_entity->create_related_entity(
    iv_relation_name = 'ZAET_CA_ATAB000000' ).
  lr_leading_entity->SET_PROPERTY_AS_STRING(
    iv_attr_name = 'ZZVISITDATE'             "#EC NOTEXT
    iv_value     = '01.01.2014' ).
  lr_col->ADD(
    EXPORTING
      iv_entity    = lr_leading_entity
      iv_set_focus = abap_true ).
  me->TYPED_CONTEXT->ZBOOKING_DETAIL->SET_COLLECTION( lr_col ).
endmethod.
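For reference: in the snippet above, lr_leading_entity is declared but never instantiated, which would explain the NULL dereference. A hedged sketch of how the references would typically be bound first (the context node name ZBOOKING and the cast via collection_wrapper are assumptions, not confirmed by the thread):

```abap
* Hedged sketch (names assumed): bind the leading entity from its
* context node before calling CREATE_RELATED_ENTITY, and use the
* RETURNED child entity - that is what belongs in the collection.
DATA: lr_entity TYPE REF TO cl_crm_bol_entity.

lr_leading_entity ?= me->typed_context->zbooking->collection_wrapper->get_current( ).
IF lr_leading_entity IS BOUND.
  lr_entity = lr_leading_entity->create_related_entity(
                iv_relation_name = 'ZAET_CA_ATAB000000' ).
  lr_entity->set_property_as_string( iv_attr_name = 'ZZVISITDATE'
                                     iv_value     = '01.01.2014' ).
  CREATE OBJECT lr_col TYPE cl_crm_bol_bo_col.
  lr_col->add( iv_entity = lr_entity ).
  me->typed_context->zbooking_detail->set_collection( lr_col ).
ENDIF.
```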
Many Thanks & Regards,
Dmitry
Similar Messages
-
How do I create an interactive PDF file with variable data
We would like to basically do a 'mail merge' of our list of customers with an interactive PDF file (including videos, menus, etc - not just form fill out and web links) to create a single PDF file that contains multiple mail pieces ... one for each customer ... with each mail piece being customized for that customer. Customizations would include different greetings (Dear Bob, Dear Dana, etc), as well as different charts based on data unique to the customer, different photographs, etc.
I've seen that InDesign and Acrobat Professional can be used to create an interactive PDF (such as from http://tv.adobe.com/watch/ask-the-adobe-ones/14-calling-rufus-about-interactive-pdf-making). However I don't understand how I can insert data from a database, csv file, excel file etc into the PDF file so that each page, or each set of pages, within the PDF can be customized.
Can anyone point me to a tool to use for this?
Thanks,
Bob Kendall
For that kind of volume and unattended operation, you want InDesign Server, which is the server/high-volume edition of InDesign.
How to fill a hashed table ?
Hi,
In my last thread I asked about ways of deleting cube contents selectively using a job/FM, and one of you suggested this:
CALL FUNCTION 'RSDRD_SEL_DELETION'
EXPORTING
I_DATATARGET = 'YOUR_CUBE'
I_THX_SEL = L_THX_SEL
I_AUTHORITY_CHECK = 'X'
I_THRESHOLD = '1.0000E-01'
I_MODE = 'C'
I_NO_LOGGING = ''
I_PARALLEL_DEGREE = 1
I_NO_COMMIT = ''
CHANGING
C_T_MSG = L_T_MSG.
Although the FM is the correct one, the parameter L_THX_SEL is a hashed table and I do not know how to fill values into it. My requirement is to give a condition on the 0CALDAY InfoObject, i.e. 0CALDAY < 30 days. Please advise.
Regards,
Pramod M
Hi,
Internal Tables
Internal tables provide a means of taking data from a fixed structure and storing it in working memory in ABAP. The data is stored line by line in memory, and each line has the same structure. In ABAP, internal tables fulfill the function of arrays. Since they are dynamic data objects, they save the programmer the task of dynamic memory management in his or her programs. You should use internal tables whenever you want to process a dataset with a fixed structure within a program. A particularly important use for internal tables is for storing and formatting data from a database table within a program. They are also a good way of including very complicated data structures in an ABAP program.
Data Type of an Internal Table
The data type of an internal table is fully specified by its line type, key, and table type.
Line Type
The line type of an internal table can be any data type. The data type of an internal table is normally a structure. Each component of the structure is a column in the internal table. However, the line type may also be elementary or another internal table.
Key
The key identifies table rows. There are two kinds of key for internal tables - the standard key and a user-defined key. You can specify whether the key should be UNIQUE or NON-UNIQUE. Internal tables with a unique key cannot contain duplicate entries. The uniqueness depends on the table access method.
For tables with a structured row type, the standard key is formed from all character-type columns of the internal table. For tables with an elementary (non-structured) row type, the default key is the entire line. If the row type is itself an internal table, an empty key is defined.
The user-defined key can contain any columns of the internal table that are not internal tables themselves and do not contain internal tables. References are allowed as table keys. Internal tables with a user-defined key are called key tables. When you define the key, the sequence of the key fields is significant. You should remember this, for example, if you intend to sort the table according to the key.
Table type
The table type determines how ABAP will access individual table entries. Internal tables can be divided into three types:
Standard tables have an internal linear index. From a particular size upwards, the indexes of internal tables are administered as trees. In this case, the index administration overhead increases in logarithmic and not linear relation to the number of lines. The system can access records either by using the table index or the key. The response time for key access is proportional to the number of entries in the table. The key of a standard table is always non-unique. You cannot specify a unique key. This means that standard tables can always be filled very quickly, since the system does not have to check whether there are already existing entries.
Sorted tables are always saved sorted by the key. They also have an internal index. The system can access records either by using the table index or the key. The response time for key access is logarithmically proportional to the number of table entries, since the system uses a binary search. The key of a sorted table can be either unique or non-unique. When you define the table, you must specify whether the key is to be UNIQUE or NON-UNIQUE. Standard tables and sorted tables are known generically as index tables.
Hashed tables have no linear index. You can only access a hashed table using its key. The response time is independent of the number of table entries and is constant, since the system accesses the table entries using a hash algorithm. The key of a hashed table must be unique. When you define the table, you must specify the key as UNIQUE.
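For illustration (a sketch; the type and field names are assumptions, not from the thread), a hashed table is declared with a UNIQUE KEY, filled with INSERT, and read only via that key:

```abap
* Sketch: declaring, filling, and reading a hashed table.
TYPES: BEGIN OF ty_flight,
         carrid TYPE c LENGTH 3,
         connid TYPE n LENGTH 4,
         seats  TYPE i,
       END OF ty_flight.

DATA: lt_flights TYPE HASHED TABLE OF ty_flight
                 WITH UNIQUE KEY carrid connid,
      ls_flight  TYPE ty_flight.

ls_flight-carrid = 'LH'.
ls_flight-connid = '0400'.
ls_flight-seats  = 280.
INSERT ls_flight INTO TABLE lt_flights.  " fill via INSERT ... INTO TABLE

" Key access only - constant response time regardless of table size
READ TABLE lt_flights INTO ls_flight
     WITH TABLE KEY carrid = 'LH' connid = '0400'.
```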
Generic Internal Tables
Unlike other local data types in programs, you do not have to specify the data type of an internal table fully. Instead, you can specify a generic construction, that is, the key or key and line type of an internal table data type may remain unspecified. You can use generic internal tables to specify the types of field symbols and the interface parameters of procedures . You cannot use them to declare data objects.
Internal Tables as Dynamic Data Objects
Internal tables are always completely specified regarding row type, key and access type. However, the number of lines is not fixed. Thus internal tables are dynamic data objects, since they can contain any number of lines of a particular type. The only restriction on the number of lines an internal table may contain are the limits of your system installation. The maximum memory that can be occupied by an internal table (including its internal administration) is 2 gigabytes. A more realistic figure is up to 500 megabytes. An additional restriction for hashed tables is that they may not contain more than 2 million entries. The line types of internal tables can be any ABAP data types - elementary, structured, or internal tables. The individual lines of an internal table are called table lines or table entries. Each component of a structured line is called a column in the internal table.
Choosing a Table Type
The table type (and particularly the access method) that you will use depends on how the typical internal table operations will be most frequently executed.
Standard tables
This is the most appropriate type if you are going to address the individual table entries using the index. Index access is the quickest possible access. You should fill a standard table by appending lines (ABAP APPEND statement), and read, modify and delete entries by specifying the index (INDEX option with the relevant ABAP command). The access time for a standard table increases in a linear relationship with the number of table entries. If you need key access, standard tables are particularly useful if you can fill and process the table in separate steps. For example, you could fill the table by appending entries, and then sort it. If you use the binary search option (BINARY) with key access, the response time is logarithmically proportional to the number of table entries.
Sorted tables
This is the most appropriate type if you need a table which is sorted as you fill it. You fill sorted tables using the INSERT statement. Entries are inserted according to the sort sequence defined through the table key. Any illegal entries are recognized as soon as you try to add them to the table. The response time for key access is logarithmically proportional to the number of table entries, since the system always uses a binary search. Sorted tables are particularly useful for partially sequential processing in a LOOP if you specify the beginning of the table key in the WHERE condition.
Hashed tables
This is the most appropriate type for any table where the main operation is key access. You cannot access a hashed table using its index. The response time for key access remains constant, regardless of the number of table entries. Like database tables, hashed tables always have a unique key. Hashed tables are useful if you want to construct and use an internal table which resembles a database table or for processing large amounts of data.
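The partially sequential processing mentioned above for sorted tables can be sketched as follows (the type and field names are assumptions for illustration):

```abap
* Sketch: a sorted table processed by the leading part of its key.
TYPES: BEGIN OF ty_order,
         customer TYPE c LENGTH 10,
         item     TYPE n LENGTH 6,
         amount   TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_order.

DATA: lt_orders TYPE SORTED TABLE OF ty_order
                WITH NON-UNIQUE KEY customer item,
      ls_order  TYPE ty_order.

" INSERT keeps the sort order; entries that would violate a unique
" key are rejected as soon as you try to add them
ls_order-customer = 'C042'.
ls_order-item     = '000001'.
ls_order-amount   = '10.50'.
INSERT ls_order INTO TABLE lt_orders.

" Partially sequential processing: only the contiguous block of rows
" for customer 'C042' is looped over, located by binary search
LOOP AT lt_orders INTO ls_order WHERE customer = 'C042'.
  " process ls_order here
ENDLOOP.
```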
If helpful, please give me max reward. -
How to fill pre-defined tables in Word 2003 with test results?
Hi All,
I have a simple test board , where i measure different DC levels, with a simple Vi that saves the test results in to a Text file.
These results in the text file were manually typed into a test report, which is a Word document with tables, but I have now created a macro in Word 2003 that can take the results from the text file and add them to specific rows and columns in my Word test report file.
The big question is: how can I get LabVIEW, after it has measured the DC levels, to write the test results to the specific tables, rows, and columns in the Word document without deleting everything else?
The attached picture illustrates one tables , and the yellow cells showing the cells that need to be filled in with test results.
Labview vers: 2009
Solved!
Go to Solution.
Attachments:
Template.jpg 37 KB
Hi!
I'm a test engineer for electronics and I create a lot of automated tests with LabVIEW. Many of our customers want test results in a fancy, easy-to-read report. I did not have the MS Office Report Generation toolkit, so I did some research on the Excel board and came across a link to a free, open Excel toolkit ( Link ). The toolkit looks like this in LabVIEW:
As you can see, you have many tools to get the job done. I'm talking about reading, writing and formatting Excel. The toolkit uses activeX. If you give me a few minutes I'll create an example for you.
What I do when I create a report:
Step 1. Create a template with all your formatting, macros, images and colors. Name the file for example "template.xlsx" (or .xls if you use an older office version).
Step 2. In your program, copy the file to the desired location and give it a proper name.
Step 3. Acquire some data and write it to the file you copied and renamed.
For my latest project I created a fully automated test program for testing an electric product. After each test a report is generated based on the procedure above. After testing all the products I have, for example, 300 reports. I then created a program that reads all the reports and creates a summary report containing statistics for all the tested products. From my program I can also print the reports or the summary report and deliver them to our customer. All this takes just a few seconds and adds extra quality to the products we sell to our customer.
Regards,
Even
Certified LabVIEW Associate Developer
Automated Test Developer
Topro AS
Norway -
How to fill a hashed table structure
Hi,
In my last thread I asked about ways of deleting cube contents selectively using a job/FM, and one of you suggested this:
CALL FUNCTION 'RSDRD_SEL_DELETION'
EXPORTING
I_DATATARGET = 'YOUR_CUBE'
I_THX_SEL = L_THX_SEL
I_AUTHORITY_CHECK = 'X'
I_THRESHOLD = '1.0000E-01'
I_MODE = 'C'
I_NO_LOGGING = ''
I_PARALLEL_DEGREE = 1
I_NO_COMMIT = ''
CHANGING
C_T_MSG = L_T_MSG.
Although the FM is the correct one, the parameter L_THX_SEL is a hashed table and I do not know how to fill values into it. My requirement is to give a condition on the 0CALDAY InfoObject, i.e. 0CALDAY < 30 days. Please advise.
Regards,
Pramod M
Hello,
I recommend using the program RSDRD_DELETE_FACTS. This is easier: it generates a deletion program for an individual cube, and you can specify conditions for it with the help of variants. Just follow the steps below.
Go to SA38 and execute the program RSDRD_DELETE_FACTS. Give your cube name, select the "generate selection program" radio button, and give a name for the program. Execute, and the program for selective deletion will be generated for that cube.
Now go to SE38 and open your generated program for editing. You will see towards the end that each individual characteristic has a block of code like this:
IF NOT C011[] IS INITIAL.
CLEAR L_SX_SEL.
L_SX_SEL-IOBJNM = '0CALDAY'.
LOOP AT C011 .
CLEAR L_S_RANGE.
MOVE C011-SIGN TO L_S_RANGE-SIGN.
MOVE C011-OPTION TO L_S_RANGE-OPTION.
MOVE C011-LOW TO L_S_RANGE-LOW.
MOVE C011-HIGH TO L_S_RANGE-HIGH.
MOVE RS_C_TRUE TO L_S_RANGE-KEYFL.
APPEND L_S_RANGE TO L_SX_SEL-T_RANGE.
ENDLOOP.
INSERT L_SX_SEL INTO TABLE L_THX_SEL.
ENDIF.
Search for the 0CALDAY block; it will look like the above.
Above all these IFs for the characteristics, you can write the code for the selective deletion.
The code will be like below.
DATA: date TYPE D,
date1 TYPE D,
RUN_MN1 TYPE /BI0/OICALDAY,
LE_CO11 LIKE LINE OF C011.
date = sy-datum.
date1 = date - 30.
RUN_MN1 = date1.
LE_CO11-SIGN = 'I'.
LE_CO11-OPTION = 'LT'.
LE_CO11-LOW = RUN_MN1.
CLEAR C011.
APPEND LE_CO11 TO C011.
Hope this helps.
Regds,
Shashank -
How to fill set up tables with out missing the delta records
Hi,
I would like to fill the setup tables in the production system for a logistics application.
Can you please guide me on how to perform this?
What points should be considered?
For example, if I start filling the setup tables at 10 AM and there are postings at 10:05, 10:06, and so on,
how can I collect them? That is, will I miss any records in the next delta run? What steps should be taken care of?
Thanks in advance
Naresh.
Hi,
You can fill the setup tables during normal operating hours if you load the data into an ODS and the update queue uses 'Queued delta'; downtime is still needed to avoid duplicates. But if you use 'Direct delta' you will miss the delta documents, so it is better to go with the downtime approach in that case.
Initially your delta records are stored in the extraction queue; when you run the collective job, records are moved into the delta queue. You can run the collective job (LBWE) anytime after the init run. If you need a daily delta, schedule this job before the delta loading; you can schedule it either hourly or daily. This moves your records into the delta queue. At the time of delta loading, all your delta queue records are moved into BW.
Thanks. -
How to fill a dependent collection of data when selecting a value from an LOV
hi
I am working on a master-detail form. When I select a department name from the LOV, all
corresponding employees should be filled into a table. How is that possible programmatically?
Can you give some sample code?
Regards
Rajesh
Hi,
I am using JDeveloper 11g (ADF), and I used an LOV for searching for departments.
When I select a department, I want all employees from that department to be filled
into a table. Here I can fill data into an input text control, but I cannot fill an array of data into a table.
Can you give some advice and sample code?
regards
rajesh -
How to find the names of tables created by a specific user
I have tried;
select * from dba_tables
and
select * from user_tables
and they show 1600+ and 900+ tables respectively. I just need to find the tables created by me.
Any help or guidance is greatly appreciated.
Hi,
Welcome to the forum. This may help you.
select * from dba_objects where object_type='TABLE' and owner = <user>
select * from user_objects where object_type='TABLE'
cheers
VT -
How many extents are allocated when a table is created?
I am using Oracle 9,
Is the number going to be what we specified with MINEXTENTS?
Thanks
Srinivas,
You said,
If its AUTOALLOCATE , Oracle starts with 1 extent of 64KB , then 128KB as the first extent becomes full, then 256KB so on....
Can you help me understand this statement? I don't think it's true. See here:
SQL> select * from V$version;
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
SQL> drop tablespace test including contents and tablespaces;
drop tablespace test including contents and tablespaces
ERROR at line 1:
ORA-00905: missing keyword
SQL> drop tablespace test including contents and datafiles;
Tablespace dropped.
SQL> create tablespace test datafile 'd:\test.dbf' size 100m extent management local autoallocate;
Tablespace created.
SQL> select tablespace_name,initial_extent,next_extent from dba_tablespaces
2 where tablespace_name='TEST'/
3
SQL> select tablespace_name,initial_extent,next_extent from dba_tablespaces
2 where tablespace_name='TEST'
3 /
TABLESPACE_NAME INITIAL_EXTENT NEXT_EXTENT
TEST 65536
SQL> --Creating a table inside in this tablespace
SQL> create table t as select * from dba_objects;
Table created.
SQL> alter table t move tablespace test;
Table altered.
SQL> select tablespace_name, extent_id, bytes/1024, blocks
2 from user_extents
3 where segment_name = 'T';
TABLESPACE_NAME EXTENT_ID BYTES/1024 BLOCKS
TEST 0 64 8
TEST 1 64 8
TEST 2 64 8
TEST 3 64 8
TEST 4 64 8
TEST 5 64 8
TEST 6 64 8
TEST 7 64 8
TEST 8 64 8
TEST 9 64 8
TEST 10 64 8
TEST 11 64 8
TEST 12 64 8
TEST 13 64 8
TEST 14 64 8
TEST 15 64 8
TEST 16 1024 128
TEST 17 1024 128
TEST 18 1024 128
TEST 19 1024 128
TEST 20 1024 128
21 rows selected.
SQL>
SQL> insert into t select * from t;
50356 rows created.
SQL> /
100712 rows created.
SQL> /
201424 rows created.
SQL> /
402848 rows created.
SQL> commit;
Commit complete.
SQL> analyze table t compute statistics;
Table analyzed.
SQL> select tablespace_name, extent_id, bytes/1024, blocks
2 from user_extents
3 where segment_name = 'T';
TABLESPACE_NAME EXTENT_ID BYTES/1024 BLOCKS
TEST 0 64 8
TEST 1 64 8
TEST 2 64 8
TEST 3 64 8
TEST 4 64 8
TEST 5 64 8
TEST 6 64 8
TEST 7 64 8
TEST 8 64 8
TEST 9 64 8
TEST 10 64 8
TEST 11 64 8
TEST 12 64 8
TEST 13 64 8
TEST 14 64 8
TEST 15 64 8
TEST 16 1024 128
TEST 17 1024 128
TEST 18 1024 128
TEST 19 1024 128
TEST 20 1024 128
TEST 21 1024 128
TEST 22 1024 128
TEST 23 1024 128
TEST 24 1024 128
TEST 25 1024 128
TEST 26 1024 128
TEST 27 1024 128
TEST 28 1024 128
TEST 29 1024 128
TEST 30 1024 128
TEST 31 1024 128
TEST 32 1024 128
TEST 33 1024 128
TEST 34 1024 128
TEST 35 1024 128
TEST 36 1024 128
TEST 37 1024 128
TEST 38 1024 128
TEST 39 1024 128
TEST 40 1024 128
TEST 41 1024 128
TEST 42 1024 128
TEST 43 1024 128
TEST 44 1024 128
TEST 45 1024 128
TEST 46 1024 128
TEST 47 1024 128
TEST 48 1024 128
TEST 49 1024 128
TEST 50 1024 128
TEST 51 1024 128
TEST 52 1024 128
TEST 53 1024 128
TEST 54 1024 128
TEST 55 1024 128
TEST 56 1024 128
TEST 57 1024 128
TEST 58 1024 128
TEST 59 1024 128
TEST 60 1024 128
TEST 61 1024 128
TEST 62 1024 128
TEST 63 1024 128
TEST 64 1024 128
TEST 65 1024 128
TEST 66 1024 128
TEST 67 1024 128
TEST 68 1024 128
TEST 69 1024 128
TEST 70 1024 128
TEST 71 1024 128
TEST 72 1024 128
TEST 73 1024 128
TEST 74 1024 128
TEST 75 1024 128
TEST 76 1024 128
TEST 77 1024 128
TEST 78 1024 128
TEST 79 8192 1024
TEST 80 8192 1024
TEST 81 8192 1024
82 rows selected.
SQL>
It's not working the way you mentioned: the extents are 64 KB up to extent 15, then 1024 KB up to extent 78, and then 8192 KB. Is there something I am missing?
Aman.... -
How to effectively manage large table which is rapidly growing
All,
My environment is single node database with regular file system.
Oracle - 10.2.0.4.0
IBM - AIX
A tablespace in this database is growing rapidly. Especially a single table in that tablespace having "Long Raw" column datatype has grown from 4 GBs to 900 GBs in 6 months.
We had discussion with application team and they mentioned that due to acquisitions, data volume is increased and we are expecting it to grow up to 4 TBs in next 2 years.
In order to effectively manage the table and to avoid performance issues, we are looking for different options as below.
1) The table has a date column, so we thought of converting it to a partitioned table using range partitioning. I have never converted a 900 GB table to a partitioned table. Is this the best method?
a) How can I move the data from the regular table to the partitioned table? I searched Google but could not find a good method for converting a regular table to a partitioned table. Can you help me out / share best practices?
2) In one article I read that BLOB is better than the Long Raw datatype. How easy is it to convert from Long Raw? Will BLOB yield better performance and use disk space effectively?
3) The application team has a purging activity based on application logic. We thought of shrinking the tables with row movement enabled: "alter table <table name> shrink space cascade". But it returns an error that the table contains a "Long" datatype. Any suggestions?
Any other methods / suggestions to handle this situation effectively..
Note: By end of 2010, we have plans of moving to RAC with ASM.
Thanks
To answer your question 2:
> In one article I read that BLOB is better than the Long Raw datatype. How easy is it to convert from Long Raw? Will BLOB yield better performance and use disk space effectively?
Yes, LOBs (BLOBs or CLOBs) are supposed to be better than raws (or long raws). In addition, I believe Oracle has desupported, or will shortly desupport, the use of long raws in favor of LOBs (CLOBs or BLOBs, as appropriate).
There is a function called TO_LOB that you have to use to convert. It's a pain, because you have to create the second table and then insert into the second table from the first table using the TO_LOB function.
from my notes...
=================================================
Manually recreate the original table...
Next, recreate (based on describe of the table), except using CLOB instead of LONG:
SQL> create table SPACER_STATEMENTS
2 (OWNER_NAME VARCHAR2(30) NOT NULL,
3 FOLDER VARCHAR2(30) NOT NULL,
4 SCRIPT_ID VARCHAR2(30) NOT NULL,
5 STATEMENT_ID NUMBER(8) NOT NULL,
6 STATEMENT_DESC VARCHAR2(25),
7 STATEMENT_TYPE VARCHAR2(10),
8 SCRIPT_STATEMENT CLOB,
9 ERROR VARCHAR2(1000),
10 NUMBER_OF_ROWS NUMBER,
11 END_DATE DATE
12 )
13 TABLESPACE SYSTEM
14 ;
Table created.
Try to insert the data using select from original table...
SQL> insert into SPACER_STATEMENTS select * from SPACER_STATEMENTS_ORIG;
insert into SPACER_STATEMENTS select * from SPACER_STATEMENTS_ORIG
ERROR at line 1:
ORA-00997: illegal use of LONG datatype
That didn't work...
Now, let's use TO_LOB:
SQL> insert into SPACER_STATEMENTS
2 (OWNER_NAME, FOLDER, SCRIPT_ID, STATEMENT_ID, STATEMENT_DESC, STATEMENT_TYPE, SCRIPT_STATEMENT, ERROR, NUMBER_OF_ROWS, END_DATE)
3 select OWNER_NAME, FOLDER, SCRIPT_ID, STATEMENT_ID, STATEMENT_DESC, STATEMENT_TYPE, TO_LOB(SCRIPT_STATEMENT), ERROR, NUMBER_OF_ROWS, END_DATE
4 from SPACER_STATEMENTS_ORIG;
10 rows created.
works well...
=============================================================== -
I had an interview question that is:
How to update a table (Customer) on a server ex: Report Server with the data from the same table (Customer) From another server ex: Transaction server?
Set up the steps so that insert, update, or delete operations take place correctly across the servers.
It would be great if someone could explain this process in detail for MS SQL Server 2008 R2.
Also, would it be different for SQL Server 2012? If so, what are the steps?
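No answer was posted, but one common approach (a hedged sketch; the linked-server, database, and column names below are assumptions) is to configure a linked server on the report server and periodically run a MERGE against the transaction server's table:

```sql
-- Hedged sketch: assumes a linked server named TRANSACTION_SRV has been
-- configured (e.g. via sp_addlinkedserver) and that both databases have
-- a Customer table keyed on CustomerID. Run on the report server.
MERGE INTO ReportDB.dbo.Customer AS target
USING TRANSACTION_SRV.SalesDB.dbo.Customer AS source
    ON target.CustomerID = source.CustomerID
WHEN MATCHED THEN
    UPDATE SET target.Name  = source.Name,
               target.Email = source.Email
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerID, Name, Email)
    VALUES (source.CustomerID, source.Name, source.Email)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```

MERGE is available from SQL Server 2008 onward, so the statement itself is unchanged in 2012; for continuous rather than scheduled synchronization, transactional replication is the usual built-in alternative.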
How to fill the details tab in the iPad tv shows with description
Can anyone tell me how to fill out the Details tab metadata in the TV Shows section on the iPad? TV shows purchased from the iTunes Store have a Details tab, beside the Episodes tab, filled in with a helpful description of the show. Isn't there a way to do that manually for personally downloaded TV shows? It's not possible to buy every piece of content from the iTunes Store. Please help if anyone knows how to add a description to the TV show details.
You should be able to do it on your computer's iTunes - select the TV programme in the TV Shows part of your iTunes library, do 'get-info' (control-I) on it, and on the popup select the Video tab and enter what you want to have in its Description field and then sync that updated version to your iPad.
-
Creating a dynamic nested menu with xml data received from a webservice
I need to create a dynamic menu based on a xml returned by a webservice.
the xml comes basically in this format:
[quote]
<resposta>
<status>Success</status>
<mensagem>Whatever</mensagem>
<dados>
<projeto nome="name" cliente="client name">
<atividade nome="name">
<etapa>
<nome>name</nome>
<other_attributes>...</other_attributes>
</etapa>
(other etapas)
</atividade>
(other atividades)
</projeto>
(other projetos)
</dados>
</resposta>
[/quote]
What I need is to create a menu like:
- Projeto.Nome - Projeto.Cliente:
- Atividade.nome:
(start button) etapa1
(start button) etapa2
- Projeto2.Nome - Projeto2.Cliente:
- Atividade.nome:
(start button) etapa1
(start button) etapa2
And so on...
I've tried using an HTTPService and a DataGroup; the code below works fine for displaying the projeto names:
[quote]
<s:HTTPService id="loginService"
url="http://timesheet.infinitech.local/services"
method="POST" contentType="application/xml"
result="handleLoginResult();"
fault="handleFault(event);" >
<s:request xmlns="">
<requisicao>
<tipo>login</tipo>
<usuario>{campoUsuario.text}</usuario>
<senha>{campoSenha.text}</senha>
</requisicao>
</s:request>
</s:HTTPService>
and the DataGroup:
<s:DataGroup dataProvider="{tarefasService.lastResult.resposta.dados.projeto}" width="100%" y="100" x="20"
includeIn="Principal">
<s:layout>
<s:VerticalLayout />
</s:layout>
<s:itemRenderer>
<fx:Component>
<s:ItemRenderer>
<s:layout>
<s:HorizontalLayout />
</s:layout>
<s:Button />
<s:Label text="{data.nome}" />
</s:ItemRenderer>
</fx:Component>
</s:itemRenderer>
</s:DataGroup>
[/quote]
I have then tried including another DataGroup inside the DataGroup item renderer, but I just couldn't get it to work, though I tried in a lot of ways (basically, it would be a DataGroup with dataProvider={data.atividade}).
Can anyone tell me how to get this to work?
I've uploaded an example xml, you can use it as the url for the HTTPService:
http://www.pdinfo.com.br/example.xml
Thanks in advance.
Hi,
A lot of the information you need is in Adobe's scripting guide http://www.adobe.com/go/learn_lc_scriptingReference Also there is a very useful Adobe guide to Calculations and Scripts (and while it is for version 6 it is still very good because of the way it is laid out) http://partners.adobe.com/public/developer/en/tips/CalcScripts.pdf
The Javascript could be used in the Layout: Ready event.
For italic font:
if (...some test...)
this.font.posture = "italic";
else
this.font.posture = "normal";
For bold font:
if (...some test...)
this.font.weight = "bold";
else
this.font.weight = "normal";
The script will change the font for the complete field; I don't think you can change parts of a field. You can also change the font colour and font type (the guides above will help).
Good luck,
Niall -
How to fill a sparse table with zeros
Hi All
I have a sparse table; most of the cells are null, but a few of the cells are "1".
I would like to fill the nulls with zero.
This table comes from pivoting a transactional table into a table that describes the attributes, to be used later for data mining.
I am thinking of
1) doing a query on user_tab_columns and copy-pasting to make a script that uses NVL,
e.g.
select 'NVL('||column_name||',0,'||column_name||'),' from user_tab_columns
where lower(table_name) = 'claims_t1'
but I run into an issue
e.g.
create or replace view claims_t2x
as
select
NVL('Diagnostic Imaging'_SPEC_SUM,0,'Diagnostic Imaging'_SPEC_SUM) 'Diagnostic Imaging'_SPEC_SUM
from
claims_t1
I keep getting the error ORA-00911: invalid character,
as the column name has "'" in it.
or
2) using PL/SQL: just doing a select and looping through the whole thing.
For 2), I am not sure how I can get all the column attributes,
as the attributes in the transaction table are not fixed, so the number of columns after pivoting is not fixed either.
any idea?
-Thanks so much for your input.
Edited by: xwo0owx on Apr 27, 2011 11:08 AM
> pivoting a transactional table into a table that will describe the attributes and later on for data mining
You should have created all the columns with DEFAULT 0 NOT NULL.
Why do all this every time? Why not generate the table itself like that?
What do you mean by "fill"? Do you want to update the rows?
Then loop through and update like this (note: the dictionary column is DATA_TYPE, not DATATYPE, and user_tab_columns stores names in upper case; double-quoting the column name in the dynamic SQL also copes with names containing spaces or quotes):
DECLARE
p_tab_name VARCHAR2 (100) := 'YOUR_TABLE_NAME';
l_sql VARCHAR2 (1000);
BEGIN
FOR i IN (SELECT column_name
FROM user_tab_columns
WHERE table_name = p_tab_name
AND data_type IN ('VARCHAR2', 'NUMBER', 'CHAR'))
LOOP
l_sql :=
'update '
|| p_tab_name
|| ' set "'
|| i.column_name
|| '"=0 where "'
|| i.column_name
|| '" is null';
EXECUTE IMMEDIATE l_sql;
END LOOP;
END;
G. -
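A side note on approach 1): the ORA-00911 comes from wrapping the column name in single quotes. In Oracle, identifiers with spaces or special characters have to be enclosed in double quotes, and NVL takes only two arguments. A minimal sketch (Python, with hypothetical column names) of generating the view text that way:

```python
# Sketch: build a CREATE VIEW statement that wraps every column in NVL(..., 0).
# Column names here are hypothetical; quirky names (spaces, apostrophes) must be
# double-quoted as identifiers, never single-quoted as string literals.

def nvl_view_sql(view_name, table_name, columns):
    # NVL takes two arguments: the value and the default used when it is null.
    select_list = ",\n  ".join(
        'NVL("{0}", 0) AS "{0}"'.format(col) for col in columns
    )
    return 'CREATE OR REPLACE VIEW {0} AS\nSELECT\n  {1}\nFROM {2}'.format(
        view_name, select_list, table_name
    )

# Hypothetical column names taken from the shape of the question above.
columns = ["Diagnostic Imaging_SPEC_SUM", "Cardiology_SPEC_SUM"]
print(nvl_view_sql("claims_t2x", "claims_t1", columns))
```

The generated text can then be executed (or the same quoting applied inside the PL/SQL loop above).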
How could I update a table created from 5 tables?
Hi everyone,
This is my problem. I have five "tables", and each one contains one row and 7 columns. On the other hand, I have one table (Ireland1) that retrieves the values of these 5 "tables" through "insert into" statements.
How would I be able to update the "Ireland1" data when one of these "tables" (remember, I have 5 tables) has changed?
I have been searching information about this but all my search has been fruitless.
Thanks in advance,
From Ireland
Hi Eric,
Thank you for your quick reply and solution. I have run your statement and this error appears:
Msg 156, Level 15, State 1, Procedure Non_current_assets_Historic_View, Line 426
Incorrect syntax near the keyword 'SELECT'.
I don't know why this error appears.
I leave my script practically in full:
USE Aerospace
GO
--TABLE NON-CURRENT ASSETS
IF OBJECT_ID('Non_current_assets_Historic') IS NOT NULL
DROP TABLE Non_current_assets_Historic
GO
CREATE TABLE Non_current_assets_Historic (
[IdCuenta] [float] NOT NULL,
[NameCuenta] [nvarchar](255) NULL,
Year_2006 decimal (14,2) NULL,
Year_2007 decimal (14,2) NULL,
Year_2008 decimal (14,2) NULL,
Year_2009 decimal (14,2) NULL,
Year_2010 decimal (14,2) NULL,
Year_2011 decimal (14,2) NULL,
Year_2012 decimal (14,2) NULL,
Year_2013 decimal (14,2) NULL,
Year_2014 decimal (14,2) NULL,
Dif_2007_2006 decimal (14,2) NULL,
Dif_2008_2007 decimal (14,2) NULL,
Dif_2009_2008 decimal (14,2) NULL,
Dif_2010_2009 decimal (14,2) NULL,
Dif_2011_2010 decimal (14,2) NULL,
Dif_2012_2011 decimal (14,2) NULL,
Dif_2013_2012 decimal (14,2) NULL,
Dif_2014_2013 decimal (14,2) NULL,
AHP_2007_2006 decimal (14,2) NULL,
AHP_2008_2007 decimal (14,2) NULL,
AHP_2009_2008 decimal (14,2) NULL,
AHP_2010_2009 decimal (14,2) NULL,
AHP_2011_2010 decimal (14,2) NULL,
AHP_2012_2011 decimal (14,2) NULL,
AHP_2013_2012 decimal (14,2) NULL,
AHP_2014_2013 decimal (14,2) NULL
)
GO
ALTER TABLE Non_current_assets_Historic
ADD CONSTRAINT PK_Non_current_assets_Historic PRIMARY KEY (IdCuenta)
GO
UPDATE Non_current_assets_Historic SET Year_2006=0 WHERE Year_2006 IS NULL
UPDATE Non_current_assets_Historic SET Year_2007=0 WHERE Year_2007 IS NULL
UPDATE Non_current_assets_Historic SET Year_2008=0 WHERE Year_2008 IS NULL
UPDATE Non_current_assets_Historic SET Year_2009=0 WHERE Year_2009 IS NULL
UPDATE Non_current_assets_Historic SET Year_2010=0 WHERE Year_2010 IS NULL
UPDATE Non_current_assets_Historic SET Year_2011=0 WHERE Year_2011 IS NULL
UPDATE Non_current_assets_Historic SET Year_2012=0 WHERE Year_2012 IS NULL
UPDATE Non_current_assets_Historic SET Year_2013=0 WHERE Year_2013 IS NULL
UPDATE Non_current_assets_Historic SET Year_2014=0 WHERE Year_2014 IS NULL
GO
INSERT INTO Non_current_assets_Historic
SELECT *
FROM Property_plant_equipment_Historic
INSERT INTO Non_current_assets_Historic
SELECT *
FROM Intangible_assets_Historic
INSERT INTO Non_current_assets_Historic
SELECT *
FROM Available_financial_assets_Historic
INSERT INTO Non_current_assets_Historic
SELECT *
FROM Deferred_tax_assets_Historic
INSERT INTO Non_current_assets_Historic
SELECT *
FROM Deposits_restricted_12M_Historic
GO
--SUMATORIO YEAR 2006
Declare @Cantidad20061 decimal (14,2)
Declare @Cantidad20062 decimal (14,2)
Declare @Cantidad20063 decimal (14,2)
Declare @Cantidad20064 decimal (14,2)
Declare @Cantidad20065 decimal (14,2)
Select @Cantidad20061 = Year_2006 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad20062 = Year_2006 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad20063 = Year_2006 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad20064 = Year_2006 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad20065 = Year_2006 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2007
Declare @Cantidad20071 decimal (14,2)
Declare @Cantidad20072 decimal (14,2)
Declare @Cantidad20073 decimal (14,2)
Declare @Cantidad20074 decimal (14,2)
Declare @Cantidad20075 decimal (14,2)
Select @Cantidad20071 = Year_2007 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad20072 = Year_2007 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad20073 = Year_2007 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad20074 = Year_2007 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad20075 = Year_2007 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2008
Declare @Cantidad20081 decimal (14,2)
Declare @Cantidad20082 decimal (14,2)
Declare @Cantidad20083 decimal (14,2)
Declare @Cantidad20084 decimal (14,2)
Declare @Cantidad20085 decimal (14,2)
Select @Cantidad20081 = Year_2008 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad20082 = Year_2008 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad20083 = Year_2008 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad20084 = Year_2008 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad20085 = Year_2008 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2009
Declare @Cantidad20091 decimal (14,2)
Declare @Cantidad20092 decimal (14,2)
Declare @Cantidad20093 decimal (14,2)
Declare @Cantidad20094 decimal (14,2)
Declare @Cantidad20095 decimal (14,2)
Select @Cantidad20091 = Year_2009 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad20092 = Year_2009 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad20093 = Year_2009 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad20094 = Year_2009 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad20095 = Year_2009 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2010
Declare @Cantidad200101 decimal (14,2)
Declare @Cantidad200102 decimal (14,2)
Declare @Cantidad200103 decimal (14,2)
Declare @Cantidad200104 decimal (14,2)
Declare @Cantidad200105 decimal (14,2)
Select @Cantidad200101 = Year_2010 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad200102 = Year_2010 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad200103 = Year_2010 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad200104 = Year_2010 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad200105 = Year_2010 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2011
Declare @Cantidad200111 decimal (14,2)
Declare @Cantidad200112 decimal (14,2)
Declare @Cantidad200113 decimal (14,2)
Declare @Cantidad200114 decimal (14,2)
Declare @Cantidad200115 decimal (14,2)
Select @Cantidad200111 = Year_2011 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad200112 = Year_2011 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad200113 = Year_2011 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad200114 = Year_2011 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad200115 = Year_2011 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2012
Declare @Cantidad200121 decimal (14,2)
Declare @Cantidad200122 decimal (14,2)
Declare @Cantidad200123 decimal (14,2)
Declare @Cantidad200124 decimal (14,2)
Declare @Cantidad200125 decimal (14,2)
Select @Cantidad200121 = Year_2012 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad200122 = Year_2012 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad200123 = Year_2012 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad200124 = Year_2012 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad200125 = Year_2012 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2013
Declare @Cantidad200131 decimal (14,2)
Declare @Cantidad200132 decimal (14,2)
Declare @Cantidad200133 decimal (14,2)
Declare @Cantidad200134 decimal (14,2)
Declare @Cantidad200135 decimal (14,2)
Select @Cantidad200131 = Year_2013 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad200132 = Year_2013 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad200133 = Year_2013 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad200134 = Year_2013 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad200135 = Year_2013 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO YEAR 2014
Declare @Cantidad200141 decimal (14,2)
Declare @Cantidad200142 decimal (14,2)
Declare @Cantidad200143 decimal (14,2)
Declare @Cantidad200144 decimal (14,2)
Declare @Cantidad200145 decimal (14,2)
Select @Cantidad200141 = Year_2014 from Non_current_assets_Historic where IdCuenta ='7'
Select @Cantidad200142 = Year_2014 from Non_current_assets_Historic where IdCuenta ='8'
Select @Cantidad200143 = Year_2014 from Non_current_assets_Historic where IdCuenta ='9'
Select @Cantidad200144 = Year_2014 from Non_current_assets_Historic where IdCuenta ='10'
Select @Cantidad200145 = Year_2014 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2007_2006
Declare @DIF_2007_20061 decimal (14,2)
Declare @DIF_2007_20062 decimal (14,2)
Declare @DIF_2007_20063 decimal (14,2)
Declare @DIF_2007_20064 decimal (14,2)
Declare @DIF_2007_20065 decimal (14,2)
Select @DIF_2007_20061 = Dif_2007_2006 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2007_20062 = Dif_2007_2006 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2007_20063 = Dif_2007_2006 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2007_20064 = Dif_2007_2006 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2007_20065 = Dif_2007_2006 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2008_2007
Declare @DIF_2008_20071 decimal (14,2)
Declare @DIF_2008_20072 decimal (14,2)
Declare @DIF_2008_20073 decimal (14,2)
Declare @DIF_2008_20074 decimal (14,2)
Declare @DIF_2008_20075 decimal (14,2)
Select @DIF_2008_20071 = Dif_2008_2007 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2008_20072 = Dif_2008_2007 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2008_20073 = Dif_2008_2007 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2008_20074 = Dif_2008_2007 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2008_20075 = Dif_2008_2007 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2009_2008
Declare @DIF_2009_20081 decimal (14,2)
Declare @DIF_2009_20082 decimal (14,2)
Declare @DIF_2009_20083 decimal (14,2)
Declare @DIF_2009_20084 decimal (14,2)
Declare @DIF_2009_20085 decimal (14,2)
Select @DIF_2009_20081 = Dif_2009_2008 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2009_20082 = Dif_2009_2008 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2009_20083 = Dif_2009_2008 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2009_20084 = Dif_2009_2008 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2009_20085 = Dif_2009_2008 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2010_2009
Declare @DIF_2010_20091 decimal (14,2)
Declare @DIF_2010_20092 decimal (14,2)
Declare @DIF_2010_20093 decimal (14,2)
Declare @DIF_2010_20094 decimal (14,2)
Declare @DIF_2010_20095 decimal (14,2)
Select @DIF_2010_20091 = Dif_2010_2009 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2010_20092 = Dif_2010_2009 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2010_20093 = Dif_2010_2009 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2010_20094 = Dif_2010_2009 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2010_20095 = Dif_2010_2009 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2011_2010
Declare @DIF_2011_20101 decimal (14,2)
Declare @DIF_2011_20102 decimal (14,2)
Declare @DIF_2011_20103 decimal (14,2)
Declare @DIF_2011_20104 decimal (14,2)
Declare @DIF_2011_20105 decimal (14,2)
Select @DIF_2011_20101 = Dif_2011_2010 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2011_20102 = Dif_2011_2010 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2011_20103 = Dif_2011_2010 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2011_20104 = Dif_2011_2010 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2011_20105 = Dif_2011_2010 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2012_2011
Declare @DIF_2012_20111 decimal (14,2)
Declare @DIF_2012_20112 decimal (14,2)
Declare @DIF_2012_20113 decimal (14,2)
Declare @DIF_2012_20114 decimal (14,2)
Declare @DIF_2012_20115 decimal (14,2)
Select @DIF_2012_20111 = Dif_2012_2011 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2012_20112 = Dif_2012_2011 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2012_20113 = Dif_2012_2011 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2012_20114 = Dif_2012_2011 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2012_20115 = Dif_2012_2011 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2013_2012
Declare @DIF_2013_20121 decimal (14,2)
Declare @DIF_2013_20122 decimal (14,2)
Declare @DIF_2013_20123 decimal (14,2)
Declare @DIF_2013_20124 decimal (14,2)
Declare @DIF_2013_20125 decimal (14,2)
Select @DIF_2013_20121 = Dif_2013_2012 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2013_20122 = Dif_2013_2012 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2013_20123 = Dif_2013_2012 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2013_20124 = Dif_2013_2012 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2013_20125 = Dif_2013_2012 from Non_current_assets_Historic where IdCuenta ='11'
--SUMATORIO DIF_2014_2013
Declare @DIF_2014_20131 decimal (14,2)
Declare @DIF_2014_20132 decimal (14,2)
Declare @DIF_2014_20133 decimal (14,2)
Declare @DIF_2014_20134 decimal (14,2)
Declare @DIF_2014_20135 decimal (14,2)
Select @DIF_2014_20131 = Dif_2014_2013 from Non_current_assets_Historic where IdCuenta ='7'
Select @DIF_2014_20132 = Dif_2014_2013 from Non_current_assets_Historic where IdCuenta ='8'
Select @DIF_2014_20133 = Dif_2014_2013 from Non_current_assets_Historic where IdCuenta ='9'
Select @DIF_2014_20134 = Dif_2014_2013 from Non_current_assets_Historic where IdCuenta ='10'
Select @DIF_2014_20135 = Dif_2014_2013 from Non_current_assets_Historic where IdCuenta ='11'
insert into Non_current_assets_Historic (IdCuenta,NameCuenta,Year_2006 , Year_2007,Year_2008 ,Year_2009 ,Year_2010,Year_2011 ,Year_2012 ,Year_2013 ,
Year_2014,Dif_2007_2006,Dif_2008_2007, Dif_2009_2008, Dif_2010_2009,Dif_2011_2010,Dif_2012_2011,Dif_2013_2012,Dif_2014_2013,
AHP_2007_2006, AHP_2008_2007 , AHP_2009_2008, AHP_2010_2009, AHP_2011_2010, AHP_2012_2011 , AHP_2013_2012,AHP_2014_2013 )
Values (1, 'Non-current assets', NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL),
(19, '', NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL),
(20, 'Total Non-current assets',
--Year_2006
(@Cantidad20061 +@Cantidad20062 +@Cantidad20063 +@Cantidad20064 +@Cantidad20065),
--Year_2007
(@Cantidad20071+ @Cantidad20072 + @Cantidad20073+ @Cantidad20074+ @Cantidad20075),
--Year_2008
(@Cantidad20081 + @Cantidad20082 + @Cantidad20083 + @Cantidad20084 + @Cantidad20085),
--Year_2009
(@Cantidad20091 + @Cantidad20092 + @Cantidad20093+ @Cantidad20094 + @Cantidad20095),
--Year_2010
(@Cantidad200101 + @Cantidad200102 + @Cantidad200103 + @Cantidad200104 + @Cantidad200105),
--Year_2011
(@Cantidad200111 + @Cantidad200112 + @Cantidad200113 + @Cantidad200114 + @Cantidad200115),
--Year_2012
(@Cantidad200121 + @Cantidad200122 + @Cantidad200123 +@Cantidad200124 + @Cantidad200125),
--Year_2013
(@Cantidad200131 + @Cantidad200132 +@Cantidad200133 + @Cantidad200134 + @Cantidad200135),
--Year_2014
(@Cantidad200141 + @Cantidad200142 + @Cantidad200143 + @Cantidad200144 + @Cantidad200145),
--Diferencia Numeria 2007-2006
(@DIF_2007_20061 + @DIF_2007_20062 + @DIF_2007_20063 + @DIF_2007_20064 + @DIF_2007_20065),
--Diferencia Numeria 2008-2007
(@DIF_2008_20071 + @DIF_2008_20072 + @DIF_2008_20073 + @DIF_2008_20074 + @DIF_2008_20075),
--Diferencia Numeria 2009-2008
(@DIF_2009_20081 + @DIF_2009_20082 + @DIF_2009_20083 + @DIF_2009_20084 + @DIF_2009_20085 ),
--Diferencia Numeria 2010-2009
(@DIF_2010_20091 + @DIF_2010_20092 + @DIF_2010_20093 + @DIF_2010_20094 + @DIF_2010_20095),
--Diferencia Numeria 2011-2010
(@DIF_2011_20101 + @DIF_2011_20102 + @DIF_2011_20103 +@DIF_2011_20104 + @DIF_2011_20105 ),
--Diferencia Numeria 2012-2011
(@DIF_2012_20111+@DIF_2012_20112+@DIF_2012_20113+@DIF_2012_20114+@DIF_2012_20115),
--Diferencia Numeria 2013-2012
(@DIF_2013_20121 + @DIF_2013_20122 +@DIF_2013_20123 +@DIF_2013_20124 + @DIF_2013_20125),
--Diferencia Numeria 2014-2013
(@DIF_2014_20131+@DIF_2014_20132+@DIF_2014_20133+@DIF_2014_20134+@DIF_2014_20135),
--Diferencia Porcentual 2007-2006
(@DIF_2007_20061+@DIF_2007_20062+@DIF_2007_20063+@DIF_2007_20064+@DIF_2007_20065)/(@Cantidad20061 +@Cantidad20062 +@Cantidad20063 +@Cantidad20064 +@Cantidad20065),
--Diferencia Porcentual 2008-2007
(@DIF_2008_20071 + @DIF_2008_20072 + @DIF_2008_20073 + @DIF_2008_20074 + @DIF_2008_20075)/(@Cantidad20071+ @Cantidad20072 + @Cantidad20073+ @Cantidad20074+ @Cantidad20075),
--Diferencia Porcentual 2009-2008
(@DIF_2009_20081 + @DIF_2009_20082 + @DIF_2009_20083 + @DIF_2009_20084 + @DIF_2009_20085 )/(@Cantidad20081 + @Cantidad20082 + @Cantidad20083 + @Cantidad20084 + @Cantidad20085),
--Diferencia Porcentual 2010-2009
(@DIF_2010_20091 + @DIF_2010_20092 + @DIF_2010_20093 + @DIF_2010_20094 + @DIF_2010_20095)/(@Cantidad20091 + @Cantidad20092 + @Cantidad20093+ @Cantidad20094 + @Cantidad20095),
--Diferencia Porcentual 2011-2010
(@DIF_2011_20101 + @DIF_2011_20102 + @DIF_2011_20103 +@DIF_2011_20104 + @DIF_2011_20105)/(@Cantidad200101 + @Cantidad200102 + @Cantidad200103 + @Cantidad200104 + @Cantidad200105),
--Diferencia Porcentual 2012-2011
(@DIF_2012_20111+@DIF_2012_20112+@DIF_2012_20113+@DIF_2012_20114+@DIF_2012_20115)/(@Cantidad200111 + @Cantidad200112 + @Cantidad200113 + @Cantidad200114 + @Cantidad200115),
--Diferencia Porcentual 2013-2012
(@DIF_2013_20121 + @DIF_2013_20122 +@DIF_2013_20123 +@DIF_2013_20124 + @DIF_2013_20125)/(@Cantidad200121 + @Cantidad200122 + @Cantidad200123 +@Cantidad200124 + @Cantidad200125),
--Diferencia Porcentual 2014-2013
(@DIF_2014_20131+@DIF_2014_20132+@DIF_2014_20133+@DIF_2014_20134+@DIF_2014_20135)/(@Cantidad200131 + @Cantidad200132 +@Cantidad200133 + @Cantidad200134 + @Cantidad200135))
GO
SELECT IdCuenta ,NameCuenta ,Year_2006 ,Year_2007 ,Year_2008 ,Year_2009 ,Year_2010 ,Year_2011 ,Year_2012 , Year_2013 ,Year_2014
FROM Non_current_assets_Historic
Maybe the error is in the design of this table.
Regards
Francisco
I work with SQL 2014 Management Studio
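One observation on the script itself: the long runs of DECLARE/SELECT pairs all implement the same rule, "the total row's value in each column is the sum of component rows 7 to 11". That can be expressed as a single set-based INSERT ... SELECT with SUM(...) per column filtered on IdCuenta IN (7,8,9,10,11), which avoids the hundreds of scalar variables. A sketch of the aggregation logic (Python, with made-up sample rows, just to show what the script computes):

```python
# Sketch of what the long DECLARE/SELECT script computes: a 'Total' row whose
# value in every year column is the sum of the component rows 7..11.
# The rows and values below are invented for illustration only.

COMPONENT_IDS = {7, 8, 9, 10, 11}

def total_row(rows, columns):
    # rows: list of dicts keyed by column name, each carrying an 'IdCuenta' key.
    components = [r for r in rows if r["IdCuenta"] in COMPONENT_IDS]
    # One SUM per column, exactly what SELECT SUM(col) ... WHERE IdCuenta IN (...) does.
    return {col: sum(r[col] for r in components) for col in columns}

rows = [
    {"IdCuenta": 7,  "Year_2006": 100.0, "Year_2007": 110.0},
    {"IdCuenta": 8,  "Year_2006": 50.0,  "Year_2007": 55.0},
    {"IdCuenta": 99, "Year_2006": 999.0, "Year_2007": 999.0},  # not a component; ignored
]
print(total_row(rows, ["Year_2006", "Year_2007"]))
# → {'Year_2006': 150.0, 'Year_2007': 165.0}
```

The AHP_* percentage columns then follow as (sum of Dif column) / (sum of base-year column), which is also a per-column expression rather than per-variable arithmetic.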