Compare and insert in single SQL
Hi,
I want to insert rows from a source table into a destination table, but only when a matching row does not already exist in the destination, compared on a set of columns. Is this achievable using a single SQL statement?
A MERGE statement cannot be used, as there is no unique key on either the source table or the destination table.
Here is the data and tables.
create table test_source (
x_id number,
report_sent_flag varchar2(30),
year number
);
insert into test_source
select 10 , 'Y' , 2013 from dual
union all
select 20 , 'Y' , 2013 from dual
union all
select 30 , 'Y' , 2013 from dual
union all
select 10 , 'Y' , 2013 from dual
union all
select 20 , 'Y' , 2013 from dual
union all
select 30 , 'Y' , 2013 from dual;
create table test_dest (
x_id number,
report_sent_flag varchar2(30),
year number
);
insert into test_dest
select 10 , 'Y' , 2013 from dual
union all
select 10 , 'Y' , 2013 from dual;
select * from test_source ;
10 Y 2013
10 Y 2013
20 Y 2013
20 Y 2013
30 Y 2013
30 Y 2013
select * from test_dest ;
10 Y 2013
10 Y 2013
Now I need to compare the test_source and test_dest tables on the columns (x_id, report_sent_flag, year). Since (10, Y, 2013) is already present in test_dest, it should be skipped; the rest should be inserted into test_dest from test_source.
Can it be done by SQL only?
Please assist.
Thanks
SQL only:
SQL> select * from test_source ;
X_ID REPORT_SENT_FLAG YEAR
10 Y 2013
20 Y 2013
30 Y 2013
10 Y 2013
20 Y 2013
30 Y 2013
6 rows selected
SQL> select * from test_dest ;
X_ID REPORT_SENT_FLAG YEAR
10 Y 2013
10 Y 2013
SQL> select * from log_exists_records;
X_ID REPORT_SENT_FLAG YEAR
SQL>
SQL> insert into log_exists_records
2 select * from test_source
3 where (x_id,report_sent_flag,year ) in (select x_id,report_sent_flag,year from test_dest)
4 /
2 rows inserted
SQL>
SQL> insert into test_dest
2 select * from test_source
3 where (x_id,report_sent_flag,year ) not in (select x_id,report_sent_flag,year from test_dest)
4 /
4 rows inserted
SQL> select * from test_source ;
X_ID REPORT_SENT_FLAG YEAR
10 Y 2013
20 Y 2013
30 Y 2013
10 Y 2013
20 Y 2013
30 Y 2013
6 rows selected
SQL> select * from test_dest ;
X_ID REPORT_SENT_FLAG YEAR
20 Y 2013
30 Y 2013
20 Y 2013
30 Y 2013
10 Y 2013
10 Y 2013
6 rows selected
SQL> select * from log_exists_records;
X_ID REPORT_SENT_FLAG YEAR
10 Y 2013
10 Y 2013
SQL>
Ramin Hashimzade
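The two-step approach above works, with one caveat worth noting: NOT IN returns no rows at all if any compared destination column is NULL, so NOT EXISTS with explicit equality predicates is the safer spelling. Below is a runnable sketch of the same pattern, using Python's sqlite3 purely for illustration (the thread's database is Oracle, where the same two statements apply unchanged):

```python
import sqlite3

# Tables and data follow the thread; sqlite3 stands in for Oracle here.
con = sqlite3.connect(":memory:")
con.executescript("""
create table test_source (x_id int, report_sent_flag text, year int);
create table test_dest   (x_id int, report_sent_flag text, year int);
create table log_exists_records (x_id int, report_sent_flag text, year int);
insert into test_source values (10,'Y',2013),(20,'Y',2013),(30,'Y',2013),
                               (10,'Y',2013),(20,'Y',2013),(30,'Y',2013);
insert into test_dest values (10,'Y',2013),(10,'Y',2013);
""")

# Step 1: log the source rows whose column combination already exists.
con.execute("""
insert into log_exists_records
select * from test_source s
 where exists (select 1 from test_dest d
                where d.x_id = s.x_id
                  and d.report_sent_flag = s.report_sent_flag
                  and d.year = s.year)
""")

# Step 2: insert the rest.  NOT EXISTS is NULL-safe, unlike NOT IN.
con.execute("""
insert into test_dest
select * from test_source s
 where not exists (select 1 from test_dest d
                    where d.x_id = s.x_id
                      and d.report_sent_flag = s.report_sent_flag
                      and d.year = s.year)
""")

print(con.execute("select count(*) from test_dest").fetchone()[0])          # 6
print(con.execute("select count(*) from log_exists_records").fetchone()[0])  # 2
```

As in the SQL*Plus session above, the destination ends with 6 rows and the log with the 2 duplicates.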
Similar Messages
-
Hello everyone,
To move data from roughly 50 CSV files in a loop to SQL tables (one table per CSV file), I've used the Bulk Insert task inside a Foreach Loop container.
Note that for all the columns of all the CSV files, the field length was specified as varchar(255).
It worked well for the first 6 files, but on the 7th file it found data in one of the columns of the CSV file that exceeds the 255-character limit. I would like to truncate the data and insert the remainder of the row; in other words, I would
like to insert the first 255 characters into the table for this field and let the package continue.
Also, if I used SqlBulkCopy in a Script task, would it resolve this problem? I believe I would face the same problem there too, although I would like confirmation from the experts.
Can you please advise how to get rid of this truncation error?
Any help would be greatly appreciated.
Thanks, Ankit Shah, Inkey Solutions, India. Microsoft Certified Business Management Solutions Professional. http://ankit.inkeysolutions.com

Hello! I suggest you add a Derived Column transformation between the source and destination, use string functions to extract the first 255 characters from the incoming field, and send that output to the destination field; that way you can get rid of this sort of issue.
Useful Links:
http://sqlage.blogspot.in/2013/07/ssis-how-to-use-derived-column.html
Good Luck!
Please Mark This As Answer if it solved your issue.
Please Vote This As Helpful if it helps to solve your issue -
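The Derived Column fix above amounts to clipping each value to 255 characters. The same idea is easy to apply when pre-processing the CSV before the bulk load; a sketch in Python (the `id`/`comment` columns and the in-memory buffers are hypothetical stand-ins for the real files):

```python
import csv
import io

MAX_LEN = 255  # matches the varchar(255) columns described in the thread

def truncate_fields(src, dst, max_len=MAX_LEN):
    """Copy CSV rows from src to dst, clipping every field to max_len
    characters: the same effect as the Derived Column expression above."""
    writer = csv.writer(dst)
    for row in csv.reader(src):
        writer.writerow([field[:max_len] for field in row])

# Usage with in-memory buffers and a hypothetical over-long comment field:
src = io.StringIO("id,comment\r\n1," + "x" * 300 + "\r\n")
dst = io.StringIO()
truncate_fields(src, dst)
dst.seek(0)
rows = list(csv.reader(dst))
```

The 300-character field comes out exactly 255 characters long, and the package (or loader) never sees data wider than the column.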
Loop through nested elements and insert via PL/SQL
INSERT INTO Orders(id, OrderXML) VALUES
(S_Orders.Nextval,
'<?xml version="1.0" encoding="utf-8" ?>
<Order xmlns="urn:foo" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="urn:foo foo-1.xsd">
<OrderRef>BBB</OrderRef>
<OrderDate>2005-03-29</OrderDate>
<CustomerID>1051</CustomerID>
<Items>
<Item>
<ProductID>7</ProductID>
<Price>45.6</Price>
<Quantity>2</Quantity>
</Item>
<Item>
<ProductID>19</ProductID>
<Price>73.5</Price>
<Quantity>10</Quantity>
</Item>
</Items>
</Order>'
);
I have some questions regarding index search of nested elements, like Items in the above example.
I would like to know how I can LOOP through the Items (like 1..2 LOOP)
and insert those elements (items) into one table. Order information should go into another table.
Can this be done with XPath and PL/SQL?
Regards
Ulf

Hi Marco!
Here's some more information:
CREATE TABLE ITEM (ProductID NUMBER,
Price NUMBER(8,2),
Quantity NUMBER);
CREATE TABLE ORDER (OrderRef VARCHAR2(10),
ORDER_DATE VARCHAR2(10),
CUSTOMERID NUMBER);
The main problem I have is to create a solution that is dynamic, so that I can have, for instance, one order and four items in one XML.
The second XML can have one order and 10 items.
First I want to insert the order elements into the order table and then the item records into the item table.
To complicate things further, my real XML has namespaces, but I think I can handle that.
Summary: So for each order row(element) I want to traverse the Item elements and insert them to the Item table.
Regards
/Ulf -
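Inside Oracle this is typically done with XMLTABLE (or extract/XPath in PL/SQL), but the traversal logic is easy to prototype outside the database first. A sketch with Python's ElementTree, using a simplified copy of the order document above (the xsi/schemaLocation attributes are omitted; only the urn:foo namespace matters for the lookups):

```python
import xml.etree.ElementTree as ET

# Simplified copy of the order document from the INSERT above.
xml_doc = """<?xml version="1.0" encoding="utf-8" ?>
<Order xmlns="urn:foo">
  <OrderRef>BBB</OrderRef>
  <OrderDate>2005-03-29</OrderDate>
  <CustomerID>1051</CustomerID>
  <Items>
    <Item><ProductID>7</ProductID><Price>45.6</Price><Quantity>2</Quantity></Item>
    <Item><ProductID>19</ProductID><Price>73.5</Price><Quantity>10</Quantity></Item>
  </Items>
</Order>"""

ns = {"f": "urn:foo"}
root = ET.fromstring(xml_doc)

# One row for the order table...
order_row = (root.findtext("f:OrderRef", namespaces=ns),
             root.findtext("f:OrderDate", namespaces=ns),
             int(root.findtext("f:CustomerID", namespaces=ns)))

# ...and one row per Item for the item table, however many there are.
item_rows = [(int(i.findtext("f:ProductID", namespaces=ns)),
              float(i.findtext("f:Price", namespaces=ns)),
              int(i.findtext("f:Quantity", namespaces=ns)))
             for i in root.findall("f:Items/f:Item", namespaces=ns)]
```

Because the item loop is driven by `findall`, a document with four items or ten items produces four or ten rows with no change to the code, which is exactly the dynamic behaviour asked about.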
SQL server 2014 and VS 2013 - Dataflow task, read CSV file and insert data to SQL table
Hello everyone,
I was assigned a work item wherein I have a dataflow task in a Foreach Loop container at the control flow of an SSIS package. This Foreach Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note, the tables into which I would like to push the data from each CSV file have the same names as the CSV files.
On the dataflow task, I have a Flat File component as a source; this component uses the above variable to read the data of a particular file. Now here my question comes: how can I move the data to the destination SQL table using the same variable name?
I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time. It does not change the mappings to match the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in the optimum way.
Does anybody know which is the best way to setup the Dataflow task for this requirement?
Also, I cannot use Bulk insert task here as we would like to keep a log of corrupted rows.
Any help would be much appreciated. It's very urgent.
Thanks, Ankit Shah, Inkey Solutions, India. Microsoft Certified Business Management Solutions Professional. http://ankit.inkeysolutions.com

The standard Data Flow Task supports only static metadata defined during design time. I would recommend you check the commercial COZYROC Data Flow Task Plus. It is an extension of the standard Data Flow Task and it supports dynamic metadata at runtime. You can process all your input CSV files using a single Data Flow Task Plus. No programming skills are required.
SSIS Tasks Components Scripts Services | http://www.cozyroc.com/ -
How can I read a file from a file path and insert it into SQL Server?
Hi,
I have a table called "table1" that has data like this:
id  Folder        FileName
1   f:\cfs\O\     ENDTINS5090.tif
2   D:\Grant\     CLMDOC.doc
3   f:\cfs\Team\  CORRES_3526.msg
4   f:\cfs\S\     OTH_001.PDF
I have another table called "table2" whose Content column is of the image datatype:
Id  FileName         Content
1   FileName
2   ENDTINS5090.tif
3   CLMDOC.doc
4   CORRES_3526.msg
5   OTH_001.PDF
I would like to insert into the Content column of table2 by reading the file from the file path in table1.
Is there any simple way, or is SSIS able to do it?
Please help me with this.
Thank you.

http://dimantdatabasesolutions.blogspot.co.il/2009/05/how-to-uploadmodify-more-than-one-blob.html
Best Regards, Uri Dimant, SQL Server MVP,
http://sqlblog.com/blogs/uri_dimant/
MS SQL optimization: MS SQL Development and Optimization
MS SQL Consulting:
Large scale of database and data cleansing
Remote DBA Services:
Improves MS SQL Database Performance
SQL Server Integration Services:
Business Intelligence -
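Outside SSIS, this is a small scripting job: join the folder and file name from table1, read the bytes, and insert them as a BLOB parameter into table2. A sketch using Python and sqlite3 for illustration (against SQL Server the same pattern works through pyodbc with a varbinary/image column; the temporary folder and file here are stand-ins for the real paths):

```python
import os
import sqlite3
import tempfile

con = sqlite3.connect(":memory:")
con.executescript("""
create table table1 (id integer, folder text, filename text);
create table table2 (id integer primary key, filename text, content blob);
""")

# A throwaway file standing in for the paths listed in table1.
folder = tempfile.mkdtemp()
with open(os.path.join(folder, "CLMDOC.doc"), "wb") as f:
    f.write(b"dummy document bytes")
con.execute("insert into table1 values (1, ?, 'CLMDOC.doc')", (folder + os.sep,))

# Read each path from table1 and insert the raw bytes into table2.content.
for _id, fold, name in con.execute("select id, folder, filename from table1"):
    with open(os.path.join(fold, name), "rb") as f:
        con.execute("insert into table2 (filename, content) values (?, ?)",
                    (name, f.read()))
```

table2 now holds the file bytes alongside the file name, one row per path found in table1.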
SQL Loader and Insert Into Performance Difference
Hello All,
I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10,000 records in a flat file and I want to load them into a staging table.
I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader). But I don't know by how much. Can anybody tell me the performance difference in % (like a 20% decrease) in the case of 10,000 records?
Thanks,
Kannan.

Kannan B wrote:
> Do not confuse the topic; as I told you, I'm not going to use external tables. This post is to discuss the performance difference between SQL*Loader and a simple INSERT statement.
I don't think people are confusing the topic.
External tables are a superior means of reading a file as it doesn't require any command line calls or external control files to be set up. All that is needed is a single external table definition created in a similar way to creating any other table (just with the additional external table information obviously). It also eliminates the need to have a 'staging' table on the database to load the data into as the data can just be queried as needed directly from the file, and if the file changes, so does the data seen through the external table automatically without the need to re-run any SQL*Loader process again.
Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative. -
Compare String in a table and insert the common values into a New table
Hi all,
Anyone has idea on how to compare a string value in a table.
I have a Students Table with Student_id and Student_Subject_list columns as below.
create table Students(
Student_id number,
Student_Subject_list varchar2(2000)
);
INSERT INTO Students VALUES (1,'Math,Science,Arts,Music,Computers,Law,Business,Social,Language arts,History');
INSERT INTO Students VALUES (2,'Math,Law,Business,Social,Language arts,History,Biotechnology,communication');
INSERT INTO Students VALUES (3,'History,Spanish,French,Langage arts');
INSERT INTO Students VALUES (4,'History,Maths,Science,Chemistry,English,Reading');
INSERT INTO Students VALUES (5,'Math,Science,Arts,Music,Computer Programming,Language arts,History');
INSERT INTO Students VALUES (6,'Finance,Stocks');
output
Student_id Student_Subject_list
1 Math,Science,Arts,Music,Computers,Law,Business,Social,Language arts,History
2 Math,Law,Business,Social,Language arts,History,Biotechnology,communication
3 History,Spanish,French,Langage arts
4 History,Maths,Science,Chemistry,English,Reading
5 Math,Science,Arts,Music,Computer Programming,Language arts,History
6 Finance,Stocks
I need help or some suggestions on writing a query which can compare each row's Student_Subject_list string value and insert the
common subjects into a new table (Matched_Subjects). The second table should have the columns and data below.
create table Matched_Subjects(
Student_id number,
Matching_studesnt_id Number,
Matched_Student_Subject varchar2(2000)
);
INSERT INTO Matched_Subjects VALUES (1,2,'Math,Law,Business,Social,Language arts,History');
INSERT INTO Matched_Subjects VALUES (1,3,'History,Langage arts');
INSERT INTO Matched_Subjects VALUES (1,4,'History,Maths,Science');
INSERT INTO Matched_Subjects VALUES (1,5,'Math,Science,Arts,Music,Language arts,History');
INSERT INTO Matched_Subjects VALUES (2,3,'History,Langage arts');
INSERT INTO Matched_Subjects VALUES (2,4,'History,Maths');
INSERT INTO Matched_Subjects VALUES (2,5,'Math,Language arts,History');
INSERT INTO Matched_Subjects VALUES (3,4,'History');
INSERT INTO Matched_Subjects VALUES (3,5,'Language arts,History');
INSERT INTO Matched_Subjects VALUES (4,5,'Math,Science');
output:
Student_id Match_Student_id Matched_Student_Subject
1 2 Math,Law,Business,Social,Language arts,History
1 3 History,Langage arts
1 4 History,Maths,Science
1 5 Math,Science,Arts,Music,Language arts,History
2 3 History,Langage arts
2 4 History,Maths
2 5 Math,Language arts,History
3 4 History
3 5 Language arts,History
4 5 Math,Science
any help will be appreciated.
Thanks.
Edited by: user7988 on Sep 25, 2011 8:45 AM

user7988 wrote:
> Is there an alternate approach to this without using xmlagg/xmlelement?
What Oracle version are you using? In 11.2 you can use LISTAGG:
insert
  into Matched_Subjects
with t as (
       select student_id,
              column_value l,
              regexp_substr(student_subject_list,'[^,]+',1,column_value) subject
         from students,
              table(
                cast(
                  multiset(
                    select level
                      from dual
                      connect by level <= length(regexp_replace(student_subject_list || ',','[^,]'))
                  ) as sys.OdciNumberList
                )
              )
     )
select t1.student_id,
       t2.student_id,
       listagg(t1.subject,',') within group(order by t1.l)
  from t t1,
       t t2
 where t1.student_id < t2.student_id
   and t1.subject = t2.subject
 group by t1.student_id,
          t2.student_id
STUDENT_ID MATCHING_STUDESNT_ID MATCHED_STUDENT_SUBJECT
1 2 Math,Law,Business,Social,Language arts,History
1 3 Language arts,History
1 4 Science,History
1 5 Math,Science,Arts,Music,Language arts,History
2 3 Language arts,History
2 4 History
2 5 Math,Language arts,History
3 4 History
3 5 History,Language arts
4 5 History,Science
10 rows selected.
SQL>
Prior to 11.2 you can create your own string aggregation function STRAGG; there are plenty of examples on this forum. Or use a hierarchical query:
insert
  into Matched_Subjects
with t1 as (
       select student_id,
              column_value l,
              regexp_substr(student_subject_list,'[^,]+',1,column_value) subject
         from students,
              table(
                cast(
                  multiset(
                    select level
                      from dual
                      connect by level <= length(regexp_replace(student_subject_list || ',','[^,]'))
                  ) as sys.OdciNumberList
                )
              )
     ),
     t2 as (
       select t1.student_id student_id1,
              t2.student_id student_id2,
              t1.subject,
              row_number() over(partition by t1.student_id,t2.student_id order by t1.l) rn
         from t1,
              t1 t2
        where t1.student_id < t2.student_id
          and t1.subject = t2.subject
     )
select student_id1,
       student_id2,
       ltrim(sys_connect_by_path(subject,','),',') MATCHED_STUDENT_SUBJECT
  from t2
 where connect_by_isleaf = 1
 start with rn = 1
 connect by student_id1 = prior student_id1
        and student_id2 = prior student_id2
        and rn = prior rn + 1
STUDENT_ID MATCHING_STUDESNT_ID MATCHED_STUDENT_SUBJECT
1 2 Math,Law,Business,Social,Language arts,History
1 3 Language arts,History
1 4 Science,History
1 5 Math,Science,Arts,Music,Language arts,History
2 3 Language arts,History
2 4 History
2 5 Math,Language arts,History
3 4 History
3 5 History,Language arts
4 5 History,Science
10 rows selected.

SY. -
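The LISTAGG logic is easy to sanity-check outside the database. A sketch that computes the same pairwise intersections in plain Python, ordering the matched subjects by their position in the lower-id student's list, as `listagg(...) within group (order by t1.l)` does. Note that 'Langage arts' keeps the spelling from the original INSERTs, so pairs involving student 3 match only on History here:

```python
# Subject lists exactly as inserted into the Students table above.
students = {
    1: "Math,Science,Arts,Music,Computers,Law,Business,Social,Language arts,History",
    2: "Math,Law,Business,Social,Language arts,History,Biotechnology,communication",
    3: "History,Spanish,French,Langage arts",
    4: "History,Maths,Science,Chemistry,English,Reading",
    5: "Math,Science,Arts,Music,Computer Programming,Language arts,History",
    6: "Finance,Stocks",
}

def matched_subjects(students):
    """Pairwise common subjects, ordered by position in the lower id's list
    (mirrors listagg(t1.subject, ',') within group (order by t1.l))."""
    ids = sorted(students)
    lists = {sid: s.split(",") for sid, s in students.items()}
    out = []
    for a in ids:
        for b in ids:
            if a < b:
                common = [s for s in lists[a] if s in set(lists[b])]
                if common:
                    out.append((a, b, ",".join(common)))
    return out

for a, b, subjects in matched_subjects(students):
    print(a, b, subjects)
```

For example, the (1, 4) pair yields "Science,History", matching the query output above ("Math" and "Maths" are different strings, so they do not match).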
Multiple SQLs INSERT in a single SQL with O.Lite on PDA
Hi,
We are using (and are new to) Oracle Lite on a PDA, developing in Java. For performance and reliability we need to make multiple INSERTs in a single, dynamically created SQL statement.
We get a syntax error when executing this:
INSERT INTO t1 (row1,row2) VALUES ('x','y');
INSERT INTO t1 (row1,row2) VALUES ('a','v');
INSERT INTO t1 (row1,row2) VALUES ('e','r');
etc... in a single execSql
Any suggestions would be helpful!
JMarc

Hi Praveen,
If your use case involves a large number of data rows to insert into the DB, I believe the most appropriate way would be to form an XML document and pass it to the DB. Within the DB you can create an SP, perform your logical steps (if any), and then insert the data into the table.
The link shared by Muzammil above talks about the same subject.
Once within the SP (DB layer), you can fetch the entire XML using the example below:
DECLARE @data XML;
-- Element-centered XML
SET @data = '<data>
<customer>
<id>1</id>
<name>Name 1 </name>
</customer>
<customer>
<id>2</id>
<name>Name 2</name>
</customer>
<customer>
<id>3</id>
<name>Name 3</name>
</customer>
</data>';
SELECT T.customer.value('(id)[1]', 'INT') AS customer_id,
T.customer.value('(name)[1]', 'VARCHAR(20)') AS customer_name
FROM @data.nodes('data/customer') AS T(customer);
The above will give you the output from the XML in a single shot.
You can also find the maximum number of rows as below:
declare @max int
select @max = @data.value('fn:count(/data/customer/id)','int')
select @max
I believe the above should help you with your insertion. -
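Besides the XML approach shown above, most client APIs also offer a parameterized batch insert, which avoids both string concatenation and one round trip per row. A sketch with Python's DB-API `executemany` (sqlite3 here purely for illustration; the same idea exists in JDBC as PreparedStatement batching, which would fit the Java/PDA setup in the question):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table t1 (row1 text, row2 text)")

# One parameterized statement, many rows: the driver batches these,
# avoiding one parse and one round trip per INSERT.
rows = [("x", "y"), ("a", "v"), ("e", "r")]
con.executemany("insert into t1 (row1, row2) values (?, ?)", rows)

print(con.execute("select count(*) from t1").fetchone()[0])  # 3
```

The row list can be built dynamically at runtime, which is what the multi-statement string in the question was trying to achieve.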
Compare 2 tables and insert rows missing in each table.
I have a tough situation. I have two identical tables, one in an online and one in an offline database. I know that there are rows missing from each table, so I need to compare the tables and insert the rows that do not exist. I tried this, but it took over 3 hours to run and did not return anything:
insert into t
select * from t a
where not exists (select * from [email protected] b
where a.col1 != b.col1
and a.col2 != b.col2
and a.col3 != b.col3);
and it goes on for another 7 columns.
The trouble I have is including a date clause so that the query can be broken down into comparing only a few months of data at a time; the records go back 4 years. Also, is there a way to write this so that it queries both tables at the same time in order to speed things up, or is one table at a time the best advice? Each table has over 100 million records to compare; that's why I was hoping to use a date criterion, since one column is a date.
Let me know what you advise to make this work; I hope I was on the right track.

Not sure if the MINUS operator will perform better with your data set, but:
SQL> create table t1 (some_id number, some_date date)
Table created.
SQL> create table t2 (some_id number, some_date date)
Table created.
SQL> insert into t1 values (1, trunc(sysdate))
1 row created.
SQL> insert into t1 values (2, trunc(sysdate-5))
1 row created.
SQL> insert into t1 values (4, trunc(sysdate-90))
1 row created.
SQL> insert into t2 values (1, trunc(sysdate))
1 row created.
SQL> insert into t2 values (3, trunc(sysdate-10))
1 row created.
SQL> insert into t2 values (5, trunc(sysdate-100))
1 row created.
SQL> select * from t1
SOME_ID SOME_DAT
1 07-07-30
2 07-07-25
4 07-05-01
3 rows selected.
SQL> select * from t2
SOME_ID SOME_DAT
1 07-07-30
3 07-07-20
5 07-04-21
3 rows selected.
SQL> insert into t1 (
select some_id, some_date from t2 where some_date between sysdate-50 and sysdate
minus
select some_id, some_date from t1 where some_date between sysdate-50 and sysdate)
1 row created.
SQL> insert into t2 (
select some_id, some_date from t1 where some_date between sysdate-50 and sysdate
minus
select some_id, some_date from t2 where some_date between sysdate-50 and sysdate)
1 row created.
SQL> select * from t1
SOME_ID SOME_DAT
1 07-07-30
2 07-07-25
4 07-05-01
3 07-07-20
4 rows selected.
SQL> select * from t2
SOME_ID SOME_DAT
1 07-07-30
3 07-07-20
5 07-04-21
2 07-07-25
4 rows selected. -
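Two notes on the exchange above. First, the original NOT EXISTS attempt compares every column with `!=`, but rows match when the columns are *equal*, so those predicates should be `=`; that is why it returned nothing useful. Second, SQLite's EXCEPT is the same operator as Oracle's MINUS, so the answer's two-way sync, date window included, can be sketched end to end (dates are pinned to fixed strings rather than sysdate arithmetic, purely for reproducibility):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
create table t1 (some_id integer, some_date text);
create table t2 (some_id integer, some_date text);
insert into t1 values (1,'2007-07-30'),(2,'2007-07-25'),(4,'2007-05-01');
insert into t2 values (1,'2007-07-30'),(3,'2007-07-20'),(5,'2007-04-21');
""")

# EXCEPT is SQLite's spelling of Oracle's MINUS: rows in the first SELECT
# that are absent from the second.  The date predicate narrows the window,
# as the answer's "between sysdate-50 and sysdate" does.  Table names are
# trusted constants here, so format() is safe in this sketch.
sync = """
insert into {dst}
select some_id, some_date from {src} where some_date >= '2007-06-15'
except
select some_id, some_date from {dst} where some_date >= '2007-06-15'
"""
con.execute(sync.format(src="t2", dst="t1"))
con.execute(sync.format(src="t1", dst="t2"))

print(con.execute("select count(*) from t1").fetchone()[0])  # 4
print(con.execute("select count(*) from t2").fetchone()[0])  # 4
```

As in the SQL*Plus session, row (3, 2007-07-20) flows into t1 and row (2, 2007-07-25) into t2, while the rows outside the date window are left alone.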
In any release of SQL Developer, if I have a single worksheet containing 10 SQL statements, is it possible to place the cursor on any one of them and run only that statement, yet append its output to the existing output window? I could then select another statement, execute it, and keep appending output. In other words, do not clear the existing output or start a new output tab.
As it exists today (in any release), I can either 'Run Script', which does append but executes all the SQL (non-selectively), or I can 'Run Statement' to selectively run a single statement, but it will clear the output window (or, if pinned, start a new one). Neither is what I want.
Thank you.

Select the query you want to run. Execute it via F5. Then highlight the next query and repeat.
The output will append to the Script Output panel.
There's no way to get 2 queries to share a grid, unless you were to run them as a single query a la UNION. -
The attached file is work in progress, with some dummy data so that I can test it out without having to connect to equipment.
The second tab is the one that I am having the problem with. The output array from the replace element appears to be starting at index position 1 rather than 0, but that is OK; it still shows that the new data is placed in incrementing element locations. However, the main array that I am trying to build, which is supposed to take each new calculation and place it in the next index (row), does not appear to be working, or at least I am not getting any indication on the indicator.
Basically what I am attempting to do is gather some pulses from a device for a minute and place the results in a calculation so that they display, then do the same again the next minute but put those results in the next row, and so on until the specified time has expired and the loop exits. I need to have all results displayed and keep building the array (displayed until the end of the test). Eventually I will have to include a min/max section that displays the min and max values calculated, but that should be easy with the min/max function. Actually I thought this should have been easy, but I guess I can not see the forest through the trees. Can anyone help to clear this up for me?
Attachments:
regulation_tester_7_loops.vi 244 KB

I didn't really have time to dig in and understand your program in depth,
but I have a few tips for you that might make things a bit easier:
- You use local variables excessively, which really complicates things. Try
not to use them and it will make your life easier.
- If you flowchart the design (very similar to a dataflow diagram, keep in
mind!) you want to gather data, calculate a value from that data, store the
calculation in an array, and loop while the time is in a certain range. So
there's really not much need for a sequence as long as you get rid of the
local variables (sequences also complicate things).
- You loop again if timepassed+1 is still less than some constant. Rather
than messing with locals it seems so much easier to use a shift register (if
absolutely necessary) or, in this case, base it upon the number of iterations
of the loop. In this case it looks like "time passed" is the same thing as
the number of loop iterations, but I didn't check closely. There's an i
terminal in your while loop to read for the number of iterations.
- After having simplified your design by eliminating the unnecessary sequence
and local variables, you should be able to draw out the LabVIEW diagram.
Don't try to use the "insert into array" VIs, since there's no need. Each
iteration of your loop calculates a number which goes into the next position
of the array, right? Pass your result outside the loop, and enable indexing
on the terminal so LabVIEW automatically generates the array for you. If
your calculation is a function of previous data, then use a shift register
to keep previous values around.
I wish you luck. Post again if you have any questions. Without a more
detailed understanding of your task at hand it's kind of hard to post actual
code suggestions for you.
-joey
"nelsons" wrote in message
news:[email protected]...
> how do I create a 1d array that takes a single calculation and inserts
> the result into the first row, and then the next calculation the next
> time the loop passes that point puts the result in the second row,
> and so on until the loop is exited.
How to select from an Oracle 8i database and insert into an SQL Server 2005 database
Hi, how can I select from Oracle 8i and insert into SQL Server 2005?
Source db: Oracle 8i.
Target db: SQL Server 2005.
I need to select one table's data from Oracle 8i and insert it into SQL Server 2005.
Thanks

Thanks Khan..
Is there any query (OPENQUERY) available for that?
Regards.. -
SQL SERVER BULK FETCH AND INSERT/UPDATE?
Hi All,
I am currently working with C and SQL Server 2012. My requirement is to bulk fetch records and insert/update the same in another table with some business logic.
How do I do this?
Thanks in Advance.
Regards
Yogesh B.

> Is there a possibility that I can do a bulk fetch and place it in an array, even inside a stored procedure?
You can use temporary tables or table variables, and have them indexed as well.
> After I have processed my records, tell me a way that I will NOT go on a RECORD-by-RECORD basis, even inside a stored procedure?
As I said earlier, you can perform UPDATEs on these temporary tables or table variables and finally INSERT/UPDATE your base table.
> Arrays are used just to minimize the traffic between the server and the program area. They are used for efficient processing.
In your case you will first have to populate the array (using some of your queries from the server), which means you will first load the array, do some updates, and then send them back to the server, hence the network engagement.
So I just gave you some thoughts I feel could be useful for your implementation. Like we say, there are many ways, so pick the one that works well for you in the long run with good scalability.
Good Luck! -
How to create and insert multiple PDF pages into a single file
Greetings, everyone.
I use Acrobat 8 Standard to create PDF files, usually by printing to PDF from a CAD file (Vectorworks) or web pages on the net.
I know how to create single pages and I know how to combine them into a single PDF file by using the Document - Insert Pages command.
My question is: When creating a PDF file(e.g from my CAD program), can I print and insert instantly, rather than going through 2 steps.
Thank you in advance for your suggestions.
Sidug

No. Future questions concerning Acrobat belong in the Acrobat forum.
This forum is for dealing with Reader issues. Users of the free Reader
aren't that familiar with the commercial Acrobat Standard, Pro, or
Extended programs.
Mike -
Connecting to datasource and retrieve, insert and update data in SQL Server
hi,
I am trying to retrieve, insert and update data from SQL Server 2000 and display it in a JSPDynPage, for a Portal application. I have already created the datasource in Visual Composer. Are there any sample codes for me to use as a reference?
Thanks
Regards,
shixuan

Hi,
See this link
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/6209b52e-0401-0010-6a9f-d40ec3a09424
Regards,
Senthil kumar K.