On submit, perform an insert on one table and an update on another table
I am trying to perform an insert on one table (the wizard created my form, so the insert goes against the table I created with the wizard), and one field on the form also exists in another table. Therefore, I am trying to update one attribute of the second table while inserting into the first. How do I do this in APEX?
If you used the wizard to create the form, you should see a process of type 'Automatic Row Processing (DML)' on your page, which performs the INSERT/UPDATE/DELETE on your form table. There you can see that APEX performs the INSERT only when REQUEST is in 'INSERT, CREATE, CREATE_AGAIN, CREATEAGAIN'.
So create one more PL/SQL page process that executes 'on Submit after validations' and write the update process as follows:
begin
-- pseudo table/columns
update tbl_second
set col1 = :p1_item
where pk_col = :p1_pk_item;
end;
Make this process conditional so that it performs the UPDATE only when the request value is in 'INSERT, CREATE, CREATE_AGAIN, CREATEAGAIN' (i.e. only when you are inserting into your form table).
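For the condition itself, a 'PL/SQL Expression' condition along these lines should work (the request values are the wizard defaults mentioned above; adjust if your buttons use different request names):

```sql
-- Condition type: PL/SQL Expression
-- True only for the create/insert requests, so the second-table
-- UPDATE fires only when the form table is being inserted into
:REQUEST IN ('INSERT', 'CREATE', 'CREATE_AGAIN', 'CREATEAGAIN')
```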
Cheers,
Hari
p.s. I think you may also need to update the second table when someone updates your form table.
Edited by: Hari_639 on Oct 26, 2009 9:46 AM
Similar Messages
-
The type of the database table and work area (or internal table)...
Hello
I am trying to use a database and select all records from it and store them into an internal table.
My code:
Select * from xixi_dbcurrency into table gt_currency.
The error:
"The type of the database table and work area (or internal table) "GT_CURRENCY" are not Unicode-convertible . . . . . . . . . . "
Any suggestions?
Thank you
Hi Thomas,
Thank you for your inputs above.
As you suggested, if we use INTO CORRESPONDING FIELDS OF TABLE, then it resolves the error.
But I have below piece of code:
DATA: it_new_source TYPE STANDARD TABLE OF _ty_s_sc_1,
wa_source TYPE _ty_s_sc_1,
wa_new_source TYPE _ty_s_sc_1,
ls_target_key TYPE t_target_key.
SELECT * INTO CORRESPONDING FIELDS OF TABLE it_new_source
FROM /bic/afao06pa100
FOR ALL ENTRIES IN SOURCE_PACKAGE
where /bic/fcckjobno = SOURCE_PACKAGE-/bic/fcckjobno
and /bic/fcckjitid = SOURCE_PACKAGE-/bic/fcckjitid.
But since this reads into corresponding fields of the table, the data load from one DSO to the other DSO runs for more than 15 hours, still does not complete, and ends in a dump.
So if I switch the SELECT to the below:
SELECT * FROM /bic/afao06pa100
INTO TABLE it_new_source
FOR ALL ENTRIES IN SOURCE_PACKAGE
where /bic/fcckjobno = SOURCE_PACKAGE-/bic/fcckjobno
and /bic/fcckjitid = SOURCE_PACKAGE-/bic/fcckjitid.
Then I am getting the below error:
E: The type of the database table and work area (or internal table) "IT_NEW_SOURCE" are not Unicode convertible.
Can you please advise on this, as the performance of the start routine code needs to improve.
Thank You. -
Import data from few tables and export into the same tables on different db
I want to import data from a few tables and export it into the same tables on a different database. But on the target database, additional columns have been added to those tables. How can I do the import?
It's urgent; can anyone please help me do this?
Thanks.
Hello Junior DBA,
maybe try it with the "copy command".
http://download.oracle.com/docs/cd/B14117_01/server.101/b12170/apb.htm
Have a look at the section "Understanding COPY Command Syntax".
Here is an example of a COPY command that copies only two columns from the source table, and copies only those rows in which the value of DEPARTMENT_ID is 30:
Regards
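The example the linked doc refers to is along these lines (the connect string and the table and column names below are placeholders, not from the original post; the "-" at the end of each line is the SQL*Plus continuation character):

```sql
-- SQL*Plus COPY: two columns only, rows where DEPARTMENT_ID = 30
COPY FROM HR@SOURCEDB -
CREATE EMPCOPY2 (LAST_NAME, SALARY) -
USING SELECT LAST_NAME, SALARY -
FROM EMP_DETAILS_VIEW -
WHERE DEPARTMENT_ID = 30
```

For the original question (a target table with extra columns), selecting only the matching columns in the USING query, or falling back to a plain INSERT ... SELECT over a database link, may be the simpler route.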
Stefan -
I need to know the name of the database table and the fields in that table
hi,
I need to know the name of the database table and the fields in that table for the following front-end fields:
1) incident details.
2) ownership details
3) injury type
4) % of investigation completed within 7 days.
5) count of incident type
6) cost of workers compensation claim.
7) injury resulting from for workers compensation claim
8) investigation free text.
9) investigation contribution factors.
10) investigation root cause.
11) investigation root cause free text
12) employee risk assessment
13) potential infringement notice issued
14) actual infringement notice issued.
15) actual infringement notice reference number.
16) vehicle damaged text.
18) when the incident occurred.
thanks and regards,
pronoy.
Hello,
Check CCIHT* under se16 and search for relevant information
Thanks
Jayakumar -
How to insert some records in one table and some records in another table
Interview question:
How do you insert records into two tables by using a trigger?
CREATE OR REPLACE TRIGGER Emp_Ins_Upd_Del_Trig
BEFORE DELETE OR INSERT OR UPDATE ON emp
FOR EACH ROW
BEGIN
  IF UPDATING THEN
    UPDATE emp2
    SET empno = :new.empno,
        ename = :new.ename,
        -- job, mgr, hiredate, comm and deptno left out, as in the original
        sal   = :new.sal,
        sdate = :new.sdate,
        edate = :new.edate
    WHERE empno = :old.empno;
  END IF;
  IF INSERTING THEN
    INSERT INTO emp2 (empno, ename, sal, sdate, edate)
    VALUES (:new.empno, :new.ename, :new.sal, :new.sdate, :new.edate);
  END IF;
  IF DELETING THEN
    DELETE FROM emp2
    WHERE empno = :old.empno;
  END IF;
END;
It is working fine, but he wants to insert a specific limit of records into one table and another specified limit of records into the other.
In this scenario, can I insert records by using a count of records?
Please help me.
Can you be more specific on the "limit"?
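One hedged reading of the question (assuming "limit" means: route the first N rows to one table and the rest to another; the table names and the limit of 10 below are assumptions, not from the post) is a trigger that checks the current row count:

```sql
-- Sketch only: routes inserts to emp2 until it holds 10 rows, then to emp3
CREATE OR REPLACE TRIGGER emp_split_ins_trig
BEFORE INSERT ON emp
FOR EACH ROW
DECLARE
  v_cnt PLS_INTEGER;
BEGIN
  SELECT COUNT(*) INTO v_cnt FROM emp2;
  IF v_cnt < 10 THEN
    INSERT INTO emp2 (empno, ename) VALUES (:new.empno, :new.ename);
  ELSE
    INSERT INTO emp3 (empno, ename) VALUES (:new.empno, :new.ename);
  END IF;
END;
```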
Conditional insert can be used in this case. -
Unable to describe the table and unable to drop the table
Hi,
I have a temp table that we use as a staging table to import data into the main table through some scheduled procedures. It is dropped every day and recreated through a script.
Somehow, while I was trying to drop the table manually, the session hung. After that I could not find the table in dba_objects, dba_tables or anywhere else.
But now I am unable to create the table manually (the CREATE command keeps running without giving any error), and I also get no error (it just keeps running) if I issue a DROP or DESC of the table.
Can anyone please help with this? Is the table still stored somewhere in the DB, or do we have any option to repair it?
SQL> select OWNER,OBJECT_NAME,OBJECT_TYPE,STATUS from dba_objects where OBJECT_NAME like 'TEMP%';
no rows selected
SQL> desc temp
Thanks in advance.
Hi,
if a table is dropped, it is moved to the recycle bin (DBA_RECYCLEBIN), and its original name is changed automatically by Oracle.
For example:
SQL> create table tst (col varchar2(10), row_chng_dt date);
Table created.
SQL> insert into tst values ('Version1', sysdate);
1 row created.
SQL> select * from tst ;
COL ROW_CHNG
Version1 16:10:03
If the RECYCLEBIN initialization parameter is set to ON (the default in 10g), then dropping this table will place it in the recyclebin:
SQL> drop table tst;
Table dropped.
SQL> select object_name, original_name, type, can_undrop as "UND", can_purge as "PUR", droptime
2 from recyclebin
SQL> /
OBJECT_NAME ORIGINAL_NAME TYPE UND PUR DROPTIME
BIN$HGnc55/7rRPgQPeM/qQoRw==$0 TST TABLE YES YES 2013-10-08:16:10:12
All that happened to the table when we dropped it was that it got renamed. The table data is still there and can be queried just like a normal table:
SQL> alter session set nls_date_format='HH24:MI:SS' ;
Session altered.
SQL> select * from "BIN$HGnc55/7rRPgQPeM/qQoRw==$0" ;
COL ROW_CHNG
Version1 16:10:03
Since the table data is still there, it's very easy to "undrop" the table. This operation is known as a "flashback drop". The command is FLASHBACK TABLE... TO BEFORE DROP, and it simply renames the BIN$... table to its original name:
SQL> flashback table tst to before drop;
Flashback complete.
SQL> select * from tst ;
COL ROW_CHNG
Version1 16:10:03
SQL> select * from recyclebin ;
no rows selected
It's important to know that after you've dropped a table, it has only been renamed; the table segments are still sitting there in your tablespace, unchanged, taking up space. This space still counts against your user tablespace quotas, as well as filling up the tablespace. It will not be reclaimed until you get the table out of the recyclebin. You can remove an object from the recyclebin by restoring it, or by purging it from the recyclebin.
SQL> select object_name, original_name, type, can_undrop as "UND", can_purge as "PUR", droptime
2 from recyclebin
SQL> /
OBJECT_NAME ORIGINAL_NAME TYPE UND PUR DROPTIME
BIN$HGnc55/7rRPgQPeM/qQoRw==$0 TST TABLE YES YES 2006-09-01:16:10:12
SQL> purge table "BIN$HGnc55/7rRPgQPeM/qQoRw==$0" ;
Table purged.
SQL> select * from recyclebin ;
no rows selected
Thank you
And check this link:
http://www.orafaq.com/node/968
http://docs.oracle.com/cd/B28359_01/server.111/b28310/tables011.htm
Thank you -
Join two source tables and replicat into a target table with BLOB
Hi,
I am working on an integration to source transaction data from legacy application to ESB using GG.
What I need to do is join two source tables (to de-normalize the area_id) to form the transaction detail, then transform by concatenating the transaction detail fields into a value-only CSV, and replicate it into the target ESB IN_DATA table's BLOB content field.
Based on what I have researched, a lookup joining two source tables requires SQLEXEC, which doesn't support BLOB.
What alternatives are there and what GG recommend in such use case?
Any helpful advice is much appreciated.
thanks,
Xiaocun
Xiaocun,
Not sure what your data looks like, but it's possible that the comma-separated value (CSV) requirement may be solved by something like this in your MAP statement:
colmap (usedefaults,
my_blob = @STRCAT (col02, ",", col03, ",", col04));
Since this is not 1:1 you'll be using a sourcedefs file, which is nice because it will do the datatype conversion for you under the covers (also a nice trick when migrating long raws to blobs). So col02 can be varchar2, col03 a number, and col04 a clob and they'll convert in real-time.
Mapping two tables to one is simple enough with two MAP statements; the harder challenge is joining operations from separate transactions, because OGG is operation-based and doesn't work on aggregates. It's possible you could end up using a combination of built-in parameters and functions with SQLEXEC and SQL/PL/SQL for more complicated scenarios, all depending on the design of the target table. But you have several scenarios to address.
For example, is the target table really a history table or are you actually going to delete from it? If just the child is deleted but you don't want to delete the whole row yet, you may want to use NOCOMPRESSDELETES & UPDATEDELETES and COLMAP a new flag column to denote it was deleted. It's likely that the insert on the child may really mean an update to the target (see UPDATEINSERTS).
If you need to update the LOB by appending or prepending new data then that's going to require some custom work, staging tables and a looping script, or a user exit.
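As a rough sketch of the delete-flag approach above (the schema, table, and flag-column names are assumptions, not from the original post), the relevant extract and replicat parameters might look like:

```
-- Extract side: write the full before-image of deleted rows
NOCOMPRESSDELETES

-- Replicat side: turn deletes into updates and set a flag column
UPDATEDELETES
MAP src.child_tab, TARGET tgt.child_hist,
COLMAP (USEDEFAULTS, deleted_flag = 'Y');
```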
Some parameters you may want to become familiar with if not already:
COLS | COLSEXCEPT
COLMAP
OVERRIDEDUPS
INSERTDELETES
INSERTMISSINGUPDATES
INSERTUPDATES
GETDELETES | IGNOREDELETES
GETINSERTS | IGNOREINSERTS
GETUPDATES | IGNOREUPDATES
Good luck,
-joe -
Temp tables and deferred updates
Does anyone know why the following update to #test and #test1 is deferred, but the same update to the permanent table inputtable is direct?
I haven't found any documentation that would explain this.
@@version is Adaptive Server Enterprise/15.7.0/EBF 22305 SMP SP61 /P/Sun_svr4/OS 5.10/ase157sp6x/3341/64-bit/FBO/Fri Feb 21 11:55:38 2014
create proc proctest
as
begin
-- inputtable.fiId is int not null
-- Why this is a deferred update?
select fiId into #test from inputtable
update #test set fiId = 0
-- Why this is a deferred update?
create table #test1(fiId int not null)
insert #test1 select fiId from inputtable
update #test1 set fiId = 0
-- Yay. This is a direct update.
update inputtable set fiId = 0
end
go
set showplan on
go
exec proctest
go
|ROOT:EMIT Operator (VA = 2)
|
| |UPDATE Operator (VA = 1)
| | The update mode is deferred.
| |
| | |SCAN Operator (VA = 0)
| | | FROM TABLE
| | | #test
| | | Table Scan.
| | | Forward Scan.
| | | Positioning at start of table.
| | | Using I/O Size 16 Kbytes for data pages.
| | | With LRU Buffer Replacement Strategy for data pages.
| |
| | TO TABLE
| | #test
| | Using I/O Size 2 Kbytes for data pages.
|ROOT:EMIT Operator (VA = 2)
|
| |UPDATE Operator (VA = 1)
| | The update mode is deferred.
| |
| | |SCAN Operator (VA = 0)
| | | FROM TABLE
| | | #test1
| | | Table Scan.
| | | Forward Scan.
| | | Positioning at start of table.
| | | Using I/O Size 16 Kbytes for data pages.
| | | With LRU Buffer Replacement Strategy for data pages.
| |
| | TO TABLE
| | #test1
| | Using I/O Size 2 Kbytes for data pages.
|ROOT:EMIT Operator (VA = 2)
|
| |UPDATE Operator (VA = 1)
| | The update mode is direct.
| |
| | |SCAN Operator (VA = 0)
| | | FROM TABLE
| | | inputtable
| | | Table Scan.
| | | Forward Scan.
| | | Positioning at start of table.
| | | Using I/O Size 16 Kbytes for data pages.
| | | With LRU Buffer Replacement Strategy for data pages.
| |
| | TO TABLE
| | inputtable
| | Using I/O Size 2 Kbytes for data pages.
I don't have a documentation reference, but the optimizer appears to default to deferred mode when the #table and the follow-on DML operation are in the same batch (i.e., the optimizer makes a 'safe' guess during optimization based on limited details of the #table schema).
You can get the queries to operate in direct mode by forcing the optimizer to (re)compile the UPDATEs after the #tables have been created, eg:
- create #table outside of proc; during proc creation/execution the #tables already exist so optimizer can choose direct mode
- perform UPDATEs within exec() construct; exec() calls are processed within a separate/subordinate context, ie, #table is know at time exec() call is compiled so direct mode can be chosen; obvious downside is the overhead for the exec() call and associated compilation phase ... which may be an improvement over a) executing UPDATE in deferred mode and/or b) recompiling the proc (see next bullet), ymmv
- induce a schema change to the #table so proc is recompiled (with #table details known during the recompile) thus allowing use of direct mode; while adding/dropping indexes/constraints/columns will suffice these also add extra processing overhead; I'd suggest a fairly benign schema change that also has little/no effect on table (eg, alter table #test replace fiId default null); obvious downside to this approach is the forced recompilation of the stored proc, which could add considerably to proc run times depending on volume/complexity of queries in the rest of the proc -
Sql server partition parent table and reference not partition child table
Hi,
I have two tables in SQL Server 2008 R2, Parent and Child Table.
The Parent table has a datetime column and is partitioned monthly; there is a Child table which just refers to the Parent table using a foreign-key relation.
Is there any problem with the non-partitioned child table referring to a partitioned parent table?
Thanks,
Areef
The tables will need to be offline for the operation. "Offline" here means that you wrap the entire operation in a transaction. Ideally, this transaction would:
1) Drop the foreign key.
2) Use ALTER TABLE SWITCH to drop the old data.
3) Use ALTER PARTITION FUNCTION to drop the old empty partition.
4) Use ALTER PARTITION FUNCTION to add a new empty partition.
5) Reapply the foreign keys WITH CHECK.
All but the last operation are metadata-only operations (provided that you do them right). To perform the last operation, SQL Server must scan the child table and verify that all keys are present in the parent table. This can take some time for larger tables.
During the transaction, SQL Server holds Sch-M locks on the tables, which means that they are entirely inaccessible, even for queries running with NOLOCK.
You can avoid the scan by applying the fkey constraint WITH NOCHECK, but this can have an impact on query plans, as SQL Server will not consider the constraint as trusted.
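As a sketch, the sliding-window maintenance in steps 1-5 might look like this (table, constraint, and partition-function names are placeholders; the staging table must already exist with the same structure, on the same filegroup as partition 1):

```sql
BEGIN TRANSACTION;

-- 1) Drop the foreign key so the switch is allowed
ALTER TABLE dbo.Child DROP CONSTRAINT FK_Child_Parent;

-- 2) Switch the oldest partition out to a staging table (metadata-only)
ALTER TABLE dbo.Parent SWITCH PARTITION 1 TO dbo.Parent_Staging;

-- 3) Remove the now-empty boundary, then 4) add a new one for the next month
ALTER PARTITION FUNCTION pfMonthly() MERGE RANGE ('2008-01-01');
ALTER PARTITION SCHEME psMonthly NEXT USED [PRIMARY];
ALTER PARTITION FUNCTION pfMonthly() SPLIT RANGE ('2009-01-01');

-- 5) Reapply the foreign key; WITH CHECK forces the validating scan
ALTER TABLE dbo.Child WITH CHECK
    ADD CONSTRAINT FK_Child_Parent FOREIGN KEY (ParentId)
    REFERENCES dbo.Parent (ParentId);

COMMIT TRANSACTION;
```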
An alternative which should not be entirely dismissed is to use partitioned views instead. With partitioned views, the foreign keys are not an issue, because each partition is a pair of tables, with its own local fkey.
As for the second question: it appears to be completely pointless to partition the parent, but not the child table. Or does the child table only have rows for a smaller set of the rows in the parent?
Erland Sommarskog, SQL Server MVP, [email protected] -
Avoiding Z-table and having a global internal table during Sales Order
Hi All,
I have a requirement like this.
1. In the R/3 system I am creating Sales order.
2. For each line item, it calls the APO system to check the availability of the materials, and the information is returned to the R/3 system.
3. As soon as the information is received in the R/3 system, we found one enhancement point, and this information is now captured in a Z-table to do some processing while saving the order, after processing all the line items.
4. I want to avoid the Z-table and instead have some global internal table which is available until the end of the processing of the sales order.
Solution I am thinking of:
1. One option could be creating a global internal table in SAPMV45A program in one of the enhancement points in the TOP declaration. But, that global internal table is not accessible in the enhancement point where I wanted to store the information. Because, I am actually updating the table in a FM.
2. Export it to memory and import it when needed. But how do I update the information in this internal table (which is in memory) for every line item?
Please guide me. Any help on this would be highly appreciated.
Thanks,
Babu Kilari
>
Babu Kilari wrote:
> Solution I am thinking of:
>
> 1. One option could be creating a global internal table in SAPMV45A program in one of the enhancement points in the TOP declaration. But, that global internal table is not accessible in the enhancement point where I wanted to store the information. Because, I am actually updating the table in a FM.
> Babu Kilari
If you are updating this table in an FM, you can always add one TABLES parameter to the function (if it is a custom function) and pass the globally declared internal table in the function call in the user exit; the table is updated in the FM, and when ENDFUNCTION is reached, you will have the updated internal table again at the user exit after the function call.
Imp- Declare your internal table in MV45ATZZ.
KR,
Advait
Edited by: Advait Gode on Aug 6, 2009 8:16 AM -
After submitting a podcast, it only shows one episode and never updates?
The podcast was approved on the 17th September, but it only ever showed one episode, and has never updated. Can anyone help?
https://itunes.apple.com/gb/podcast/bethany-church-warrington/id562744688 pulls a feed from http://bethanywarrington.com/blog/?feed=rss2&cat=4
The feed is generated from Wordpress. Thanks
Thank you, I think you're right. The 12th September one has an extra line in the RSS:
<enclosure url="http://bethanywarrington.com/audio/120909%20Sermon%20John%20Addison.mp3" length="0" type="audio/mpeg" />
Every post should have this, because they all have links to MP3 files. The only weird thing is, I don't know what made this one post different, because I always submit every MP3 the same way, with a link in the main body of text. How do you think I could add that into every post?
Creating a table and updating its content in the Java stack
Hi All
I have a requirement to capture the information of deployed content in a deploy log file present in the directory of a J2EE Engine and store it in the Java persistence stack. I should not use any SAP tables, so the JCo and RFC approach is not helpful in my case. Is it possible to create a table in the Java persistence layer (NetWeaver DB partition for Java)?
So can anybody help me with how to create a table and update the content? As of now I am able to capture the log info.
Somebody suggested that I use Open SQL for Java, but I am very new to this area.
Any suggestions and code would help me resolve this issue.
Regards
Kalyan
Yes, your assumption is correct.
What you need is a foreach loop based on ADO.Net enumerator which iterates through an object variable created in SSIS
The object variable you will populate inside execute sql task using query below
SELECT Col1,Col2
From Table2
Have two variables inside loop to get each iterated value of col1 and col2
Then inside loop have a data flow task with oledb source and flat file destination
Inside OLEDB Source use query as
SELECT *
FROM Table1
WHERE col1 = ?
Map parameter to Col1 variable inside loop
Now link this to flat file destination
Have a variable to generate filename using expression below
@[User::Col2] + (DT_STR,1,1252) "\\" + (DT_STR,10,1252) @[User::Col1] + ".txt"
Map this filename variable to connection string property of the flat file connection manager
Once executed you will get the desired output
Visakh
Polling the master detail table and to update the LAST_UPDATED with SYSDATE
Hi
The requirement is to poll the master-detail tables where read_flag is null and to update LAST_UPDATED with SYSDATE in both tables.
I referred to the MasterDetail and PollingPureSQLSysdateLogicalDelete samples of SOA Suite.
I used the delete polling strategy in the polling process and modified the generated TopLink descriptor as follows:
set the TopLink -> Custom SQL tab -> Delete tab with the following query
for master table (RECEIVER_DEPT) :
update RECEIVER_DEPT set READ_FLAG= 'S' , LAST_UPDATED=sysdate where DEPTNO=#DEPTNO
set the TopLink -> Custom SQL tab -> Delete tab with the following query
for Detail table (RECEIVER_EMP):
update RECEIVER_EMP set LAST_UPDATED=sysdate where EMPNO=#EMPNO
After deploying the BPEL process, data is updated in the master (RECEIVER_DEPT) table with LAST_UPDATED as sysdate and read_flag as S;
however, data in the detail (RECEIVER_EMP) table is deleted rather than updated.
Xtanto,
I suggest using JSP / Struts. UIX will be replaced by ADF Faces in JDeveloper 10.1.3, and thus I wouldn't suggest that new developments be started with UIX unless time doesn't allow waiting for ADF Faces. In that case, develop UIX in an MVC1 model, using the UIX events for navigation, because this model seems more likely to be migratable, according to the UIX statement of direction on OTN.
Back to your question. You can create a search form in JSP that forwards the request to a Struts DataAction to set the scope of the result set. The read-only table can have a link or a button to call the detail page, passing the RowKey as a string.
Have a look at the Oracle by Example (OBE) tutorials that contain similar examples.
Frank -
How to compare two internal table and store value in third table
Dear All,
There are two tables, say I_T1 & T2. In I_T1 there are 4 rows; in T2 there are multiple rows against the same field. Say there is a field X common to both tables.
Value of I_T1-X are
10
20
50
90
and value ot T2-X are
10
15
20
30
40
50
.100
Now I want to fetch data from T2 against the common field of I_T1-X and store it in another internal table.
Please suggest the proper way of doing this.
Reward points assured for a proper answer.
Regards,
Gulrez Alam
Hi, this is like your requirement;
in this I am storing the values into the final table.
REPORT ZZZZ000000.
tables:mara,marc,mard,makt.
data:begin of it_mara occurs 0,
matnr like mara-matnr,
mtart like mara-mtart,
meins like mara-meins,
end of it_mara.
data:begin of it_marc occurs 0,
matnr like marc-matnr,
pstat like marc-pstat,
werks like marc-werks,
end of it_marc.
data:begin of it_mard occurs 0,
werks like mard-werks,
lgort like mard-lgort,
labst like mard-labst,
end of it_mard.
data:begin of it_final occurs 0,
matnr like mara-matnr,
mtart like mara-mtart,
meins like mara-meins,
pstat like marc-pstat,
werks like marc-werks,
lgort like mard-lgort,
labst like mard-labst,
maktx like makt-maktx,
end of it_final.
select-options:s_matnr for mara-matnr.
select matnr
mtart
meins
from mara
into table it_mara
where matnr in s_matnr.
if not it_mara[] is initial.
select matnr
pstat
werks
from marc
into table it_marc
for all entries in it_mara
where matnr = it_mara-matnr.
if not it_marc[] is initial.
select werks
lgort
labst
from mard
into table it_mard
for all entries in it_marc
where werks = it_marc-werks.
endif.
endif.
loop at it_mara.
it_final-matnr = it_mara-matnr.
it_final-mtart = it_mara-mtart.
it_final-meins = it_mara-meins.
read table it_marc with key matnr = it_mara-matnr.
it_final-werks = it_marc-werks.
it_final-pstat = it_marc-pstat.
read table it_mard with key werks = it_marc-werks.
it_final-lgort = it_mard-lgort.
it_final-labst = it_mard-labst.
if sy-subrc = 0.
select maktx from makt into it_final-maktx where matnr = it_final-matnr.
endselect.
endif.
append it_final.
endloop.
loop at it_final.
write:/ it_final-matnr under 'material',
it_final-mtart under 'material type',
it_final-meins under 'unit of measure',
it_final-werks under 'plant' ,
it_final-pstat under 'status',
it_final-lgort under 'storage loc',
it_final-labst under 'stock',
it_final-maktx.
endloop.
reward points if useful,
venkat. -
How do I read from one table and write to another identical table?
I am very new to Oracle. I am trying to do something that should be very simple.
I am trying to read from one table in SQL and then write to another identically formatted table. I keep getting various errors. Could someone please post some very simple code that will work, so that I can play around with it?
Any help would be greatly appreciated.
Thanks,
Ron
Thanks, but I must be missing something.
I have two tables, SONGLIST and SETLIST.
The second line by itself works just fine on either table.
Here is the code I used following your suggestion, along with its error message.
Hope you can help. Thanks again...
INSERT INTO SETLIST
SELECT TITLE FROM SONGLIST WHERE ROTATION <> 'X'
ORA-00947: not enough values
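The ORA-00947 happens because the SELECT supplies fewer values than SETLIST has columns. Naming the target column explicitly (TITLE here, as in the post; any other SETLIST columns then get their defaults) should fix it:

```sql
INSERT INTO SETLIST (TITLE)
SELECT TITLE
FROM SONGLIST
WHERE ROTATION <> 'X';
```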