Database Application Tables Reconciliation in Real Time
Hi.
Is there a way to get real-time trusted reconciliation of database application tables?
I mean, if I add or update a row in a table in the trusted source, some type of agent calls an API in OIM which picks up the added/updated row (and creates the user).
Thank You.
DBAT Connector will not fulfil your requirement.
Please see the below link. It may help you in developing what you want. You can call OIM Recon APIs through DB Trigger
http://www.cs.umbc.edu/portal/help/oracle8/java.815/a64686/04_call2.htm
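To make the trigger idea concrete: the usual pattern is a trigger that copies each inserted/updated row into a change-log table, plus a small agent that drains that table and raises one reconciliation event per row. Below is a minimal Python sketch of that flow using an in-memory SQLite database; `submit_recon_event` is a hypothetical stand-in for the real OIM reconciliation API call, and the table and field names are illustrative.

```python
import sqlite3

def submit_recon_event(resource_object, row):
    # Hypothetical stand-in for the OIM reconciliation API
    # (in the real product this would be a call into the OIM Recon APIs).
    return {"object": resource_object, "data": row}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (login TEXT, first TEXT, last TEXT)")
# Change-log table that a DB trigger populates on INSERT/UPDATE.
conn.execute("""CREATE TABLE user_changes
                (login TEXT, first TEXT, last TEXT, processed INTEGER DEFAULT 0)""")
conn.execute("""CREATE TRIGGER trg_user_ins AFTER INSERT ON users
                BEGIN
                  INSERT INTO user_changes (login, first, last)
                  VALUES (NEW.login, NEW.first, NEW.last);
                END""")

conn.execute("INSERT INTO users VALUES ('jdoe', 'John', 'Doe')")

def drain_changes(conn):
    """The 'agent': read unprocessed changes and raise one recon event per row."""
    events = []
    cur = conn.execute("SELECT rowid, login, first, last FROM user_changes WHERE processed = 0")
    for rowid, login, first, last in cur.fetchall():
        events.append(submit_recon_event(
            "Xellerate User",
            {"User Login": login, "First Name": first, "Last Name": last}))
        conn.execute("UPDATE user_changes SET processed = 1 WHERE rowid = ?", (rowid,))
    return events

events = drain_changes(conn)
print(events[0]["data"]["User Login"])  # jdoe
```

Running the agent a second time raises no events, since the trigger-fed rows are marked processed.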
Similar Messages
-
TDMS database and treat its values as real time data.
Hi,
I'm a newbie in using LabVIEW. Actually I want to import data from a TDMS database and treat it as real-time data. Although I have done the first part, which is importing the data inside the program by reading TDMS and selecting channels and groups, I couldn't find a way for the second part, which is treating the data as real-time data. I tried Build Waveform, but it didn't work. Would you please help me to solve this problem? I appreciate your help.
reza_amin
I don't understand what you mean by "build waveform". If you write waveform data to a TDMS file, you will read waveform data directly from the TDMS file.
If you are using LabVIEW 2012 (download from http://www.ni.com/trylabview), you could find a tdms example VI in "C:\Program Files\National Instruments\LabVIEW 2012\examples\file\plat-tdms.llb\TDMS - Concurrent File Access.vi". This example VI demonstrates how to write Sine Waveform into a tdms file and read the Sine Waveform from the tdms file.
Hope this helps and enjoy your LabVIEW journey. :-)
Best Regards,
Bo Xie -
UCCX 7 (SR4) - real-time reporting based on option chosen
hello,
I would like to extract real-time information on which option the client chose in my UCCX 7 script. I know where to get some info from the queues in the database (Summary Tables), and I would like to build some custom real-time reporting based on which option was chosen, to know how many callers chose the voicemail option (so the call won't show up as abandoned).
Is there any way to get info from a custom variable in real time? In which table?
thank,
met.
IVR selections are not recorded within the CCX databases. You would need to save this to an external database and report from there. Your only method of capturing this within db_cra would be to store the value in an enterprise data variable, which can be seen on historical reports.
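To sketch that external-database workaround: the script would write one row per menu selection (session id, option, timestamp) to an external table, which you then report on for near-real-time counts. A minimal Python/SQLite sketch, with illustrative table and column names:

```python
import sqlite3
from datetime import datetime

# External table the UCCX script would write to with a DB step
# (table/column names here are illustrative, not CCX schema).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE ivr_selections (
                session_id TEXT, menu_option TEXT, selected_at TEXT)""")

def record_selection(db, session_id, option):
    """One row per caller menu choice."""
    db.execute("INSERT INTO ivr_selections VALUES (?, ?, ?)",
               (session_id, option, datetime.now().isoformat(timespec="seconds")))

record_selection(db, "sess-001", "voicemail")
record_selection(db, "sess-002", "agent")
record_selection(db, "sess-003", "voicemail")

# Near-real-time report: how many callers chose the voicemail option
(count,) = db.execute(
    "SELECT COUNT(*) FROM ivr_selections WHERE menu_option = 'voicemail'").fetchone()
print(count)  # 2
```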
Also, the only tables that are "real time" are those identified as such: RtCSQsSummary and RtICDStatistics for use by wallboard applications. -
Hello
Oracle 10.2.0.1
I have a Data Guard configuration with standby file management set to auto and real-time apply.
SQL> SELECT DEST_ID, RECOVERY_MODE FROM V$ARCHIVE_DEST_STATUS
where dest_id=2;
DEST_ID RECOVERY_MODE
2 MANAGED REAL TIME APPLY
1) I create a table in the primary database
2) I cancel recovery on the standby database (recover managed standby database cancel;)
3) I open the standby database read-only
Since real time apply is enabled, why can't I see my newly created table in the standby database?
Khurram;
When I changed the protection mode, I got the alerts below, therefore I couldn't open the database:
LGWR: Primary database is in MAXIMUM PROTECTION mode
LGWR: Destination LOG_ARCHIVE_DEST_2 is not serviced by LGWR
LGWR: Destination LOG_ARCHIVE_DEST_1 is not serviced by LGWR
LGWR: Minimum of 1 LGWR standby database required
Mon Sep 28 20:19:15 2009
Errors in file /oracle/u01/admin/orclprod/bdump/orclprod_lgwr_29882.trc:
ORA-16072: a minimum of one standby database destination is required
Mon Sep 28 20:19:15 2009
Errors in file /oracle/u01/admin/orclprod/bdump/orclprod_lgwr_29882.trc:
ORA-16072: a minimum of one standby database destination is required
LGWR: terminating instance due to error 16072
Instance terminated by LGWR, pid = 29882 -
Is there a way to create dependency on the real-time jobs
Hi,
We have around 80 real-time services running and loading the changed data into the target.
The process being used is
IBM Informix > IBM CDC > JMS (xml messages) > DS real-time services > Oracle EDW.
While using the above process, whenever there is a change in both the fact table and the dimension table, both real-time services load data into the target at the same time. This causes lookup failures because of the timing.
Is there a way to create a dependency to resolve the timing issue and make sure the lookup table is loaded before the master table?
Please let me know.
Thanks,
C
Hello
With the design you currently have, you will have potential sequencing issues. There is no magic in Data Services to solve this.
You might want to consider building more complex real-time jobs that accept more complex data structures and have logic to process the data in dependency order.
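One way to express that dependency logic is a topological ordering of the loads, so every lookup/dimension load finishes before the fact load that depends on it. A minimal Python sketch (job names are illustrative, not actual Data Services service names):

```python
from graphlib import TopologicalSorter

# Each job maps to the set of jobs it must wait for.
deps = {
    "load_fact_sales": {"load_dim_customer", "load_dim_product"},
    "load_dim_customer": set(),
    "load_dim_product": set(),
}

# static_order() yields jobs so that every job appears after its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order[-1])  # load_fact_sales runs last, after both dimension loads
```

The orchestration layer would then trigger each load only when everything before it in `order` has completed.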
Michael -
Recovery_mode managed vs managed real time apply
Hello All,
I am using Oracle 11.2.0.3
My primary database is an Oracle RAC 2 nodes database with ASM and my standby database is a single instance physical standby database on file systems.
My protection mode is MAXIMUM PERFORMANCE
What is the difference if I do:
alter database recover managed standby database using current logfile disconnect; (managed real time apply)
and
alter database recover managed standby database disconnect; (managed)
Does managed real time apply have any performance impact or disadvantage?
Regards,
Does managed real time apply have any performance impact or disadvantage?
Performance-wise, I don't think there will be any impact.
For the rest of your queries, you have to decide what the real use of it is. Here are the points to weigh the pros and cons of:
Difference between real time apply and Active Data Guard
What are the pros and cons of using Active Data Guard vs Data Guard?
Redo apply vs real time apply -
Oracle 11g "real time apply" not working?
We have a database originally on Oracle 10.2.0.4 and we upgraded it to 11.1.0.7.
After that we created a standby database and tried to use the "real time apply" feature.
The primary database can transfer log files to the standby database, and the standby database can also apply logs. The problem is that "real time apply" does NOT work.
Any idea what's wrong?
=== procedures ====== (standby database)
SQL> startup mount;
ORACLE instance started.
Total System Global Area 2087780352 bytes
Fixed Size 2161272 bytes
Variable Size 1795163528 bytes
Database Buffers 251658240 bytes
Redo Buffers 38797312 bytes
Database mounted.
SQL> alter database open read only;
Database altered.
SQL> alter database recover managed standby database using current logfile disconnect;
Database altered.
SQL> select PROTECTION_MODE, PROTECTION_LEVEL, DATABASE_ROLE, SWITCHOVER_STATUS, OPEN_MODE, GUARD_STATUS from v$database;
PROTECTION_MODE PROTECTION_LEVEL DATABASE_ROLE SWITCHOVER_STATUS
OPEN_MODE GUARD_S
MAXIMUM PERFORMANCE MAXIMUM PERFORMANCE PHYSICAL STANDBY NOT ALLOWED
MOUNTED NONE
SQL> select process, status from v$managed_standby;
PROCESS STATUS
ARCH CONNECTED
ARCH CONNECTED
ARCH CONNECTED
ARCH CONNECTED
RFS IDLE
MRP0 APPLYING_LOG
6 rows selected.
========== Primary database init.ora file setup =====
### for DG use
db_unique_name = DBPMY
log_archive_config='dg_config=(DBPMY,DBSBY)'
log_archive_dest_1='LOCATION=/Archive/DBPMY/arch/arch MANDATORY'
log_archive_dest_2='service=DBSBY valid_for=(online_logfiles,primary_role) db_unique_name=DBSBY LGWR ASYNC=20480 OPTIONAL REOPEN=15 NET_TIMEOUT=30'
*.log_archive_format='DBPMY_%r_%t_%s.arc'
log_archive_dest_state_1 = enable
log_archive_dest_state_2 = enable
There are a couple of things to look at.
1. Real time apply requires standby redo logs on the standby database. On the standby database run this query:
SELECT * FROM v$logfile where type = 'STANDBY';
If you get 0 rows back, you'll need to create standby logfiles.
The general guideline is to size them exactly like your redo logs but add one additional standby log to ensure it doesn't cause a bottleneck.
2. Get the size of your logfiles:
SELECT GROUP#, BYTES FROM V$LOG;
3. For example if you have 3 redo logs that are 50 MB in size, create 4 standby redo logs 50 MB each and don't multiplex them.
ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog1.rdo') SIZE 50M;
ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog2.rdo') SIZE 50M;
ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog3.rdo') SIZE 50M;
ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog4.rdo') SIZE 50M;
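The sizing rule above (same size as the redo logs, one extra standby log) is mechanical enough to generate; this Python sketch just builds the DDL strings, with an illustrative path template:

```python
def standby_logfile_ddl(redo_log_count, size_mb, path_template):
    """n redo logs -> n + 1 standby redo logs of the same size (per thread)."""
    return [
        f"ALTER DATABASE ADD STANDBY LOGFILE ('{path_template.format(i)}') SIZE {size_mb}M;"
        for i in range(1, redo_log_count + 2)
    ]

# 3 redo logs of 50 MB -> 4 standby redo logs of 50 MB
ddl = standby_logfile_ddl(3, 50, "/Archive/DBSBY/onlinelog/slog{}.rdo")
print(len(ddl))  # 4
print(ddl[0])
```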
4. Cancel recovery on standby
recover managed standby database cancel;
5. Restart recovery using real time apply
recover managed standby database using current logfile disconnect;
6. To validate that real time is working you can check a few places.
-It will say in the database alert log on standby that it's using real time apply
OR
-Check primary
SELECT status, recovery_mode FROM v$archive_dest_status where dest_name = 'LOG_ARCHIVE_DEST_2';
If the recovery_mode is "MANAGED REAL TIME APPLY" then real time apply is working, if it's anything else then we'll need to check more things.
NOTE that if you are going to allow your current primary to switch roles and become a standby then you'll want to create standby redo logs on primary as well
Sometimes recovery gets "stuck" and simply resetting the destination parameters can resolve it:
alter system set log_archive_dest_2='service=DBSBY valid_for=(online_logfiles,primary_role) db_unique_name=DBSBY LGWR ASYNC=20480 OPTIONAL REOPEN=15 NET_TIMEOUT=30';
There are some other things we can check next but let's start with the easiest fixes first. -
Hi Friends
We currently have FI-GL and EC-PCA data loads with period-selection full loads.
The business wants real-time data for their reporting. Our business doesn't close the posting periods; they still post to 2008 periods, and postings happen around the clock over month end.
We have checked direct access, but it didn't work.
Does RDA (real-time data acquisition) work with FI-GL models? I know RDA doesn't work for the 0FI_GL_4 DataSource.
Is it possible with the generic delta method by concatenating CPUDT and CPUTM from the BKPF table and using the timestamp as a delta-enabled field?
Your quick response will be appreciated.
Regards,
Chandu.
The answer to your question on how to find out whether a DataSource is RDA-enabled is given below.
Use SE16 to browse the details of your DataSources in the ROOSOURCE table.
If 'Real-Time Enabl' shows an 'X', your DataSources are real-time compatible.
See this link .
http://help.sap.com/saphelp_nw2004s/helpdata/en/52/777e403566c65de10000000a155106/frameset.htm
But if the DataSource is not real-time enabled, you cannot see that option in the DTP.
SAP has not released the FI-GL DataSources with the RDA feature; if you check the delivered version of the FI-GL DataSources you will see that the flag 'Realtime-enabled' is missing. No date has been set yet for when these DataSources will be RDA-enabled.
2:
If you really need to use the RDA functionality with the FI-GL
datasources you can use the following workaround:
A: In the table ROOSOURCE set the Flag 'Realtime' to 'X' for
the datasource you want to RDA enable.
To set the RDA flag for the datasource you can use a program like
the following:
REPORT z_rda_flag.
UPDATE roosource SET realtime = 'X'
  WHERE oltpsource = 'put the name of the datasource here'
    AND objvers = 'A'.
B. Replicate the DataSource into BW
C. Create a DSO Object
D. Migrate the DataSource (right mouse on DataSource in RSA1 DataSource
overview)
E. Create transformation.
F. Create DTP of type 'Realtime'
G. Create InfoPackage for Init-Simulation (Delta-init load without data
transfer) and trigger the Request for the data load
H. Create another InfoPackage for Realtime data acquisition
I. Create a daemon directly from InfoPackage maintenance or via transaction RSRDA. Assign the DataSource and the realtime DTP to the daemon. Start the daemon; the upload period is 1 minute.
3: The steps in point 2 above are a workaround, so we do not support it, as it is not part of SAP standard functionality. However, if you test this carefully in your DEV and QA environments, you should find that the process works.
PLEASE PAY ATTENTION TO THE FOLLOWING INFORMATION REGARDING RDA:
1. RDA can only be used for completely new scenarios! The reason is that you need a DataSource of the 7.0 type, not of the 3.x type. When you migrate a 3.x DataSource to a 7.0 DataSource, the existing transfer rules and the mapping between DataSources and InfoSources are lost (deleted) or archived. This can destroy your existing scenario. Please be very cautious and use the Realtime Data Acquisition functionality only on a completely new scenario. Make sure that the DataSource you would like to migrate or replicate as a 7.0 DataSource is not used for other data targets in your BW system!
2. Be aware that as soon as you decide to use the RDA method, you cannot load delta via the 'usual' delta upload for the same scenario anymore. It is not possible to switch between RDA daemon usage and non-RDA delta loads in the same data flow.
At the moment, no Content-delivered DataSources can be used for RDA as standard, because the flag 'Realtime-enabled' is missing in the delivered version. It is better to wait for the 'Realtime' flag to be delivered in the standard DataSource by the corresponding application.
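Coming back to the generic-delta idea in the original question: concatenating CPUDT (date, YYYYMMDD) and CPUTM (time, HHMMSS) yields a single sortable timestamp that can serve as a delta pointer. A Python sketch of that logic (the sample document rows are invented):

```python
from datetime import datetime

def delta_timestamp(cpudt: str, cputm: str) -> str:
    """Concatenate posting date (YYYYMMDD) and entry time (HHMMSS)
    into one sortable timestamp string for a generic delta."""
    # validate that the parts actually parse as a date/time
    datetime.strptime(cpudt + cputm, "%Y%m%d%H%M%S")
    return cpudt + cputm

rows = [
    {"belnr": "0100000001", "cpudt": "20081130", "cputm": "235910"},
    {"belnr": "0100000002", "cpudt": "20081201", "cputm": "000205"},
]
# Only rows newer than the last delta pointer are extracted.
last_delta = "20081130235900"
new_rows = [r for r in rows if delta_timestamp(r["cpudt"], r["cputm"]) > last_delta]
print([r["belnr"] for r in new_rows])  # both rows are newer than the pointer
```

String comparison works here because the fixed-width YYYYMMDDHHMMSS format sorts lexicographically in time order.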
regards,
Colin Moloney -
How can I generate a real-time highchart from my database data?
I have looked at several links; however, I couldn't find a working demo showing how to implement a Highchart using data from a database.
Objective: I want to generate a real-time Highcharts line graph getting data from my database. What I want is very similar to the HighChart Demo, which provides a real-time chart with randomly generated values. It is also similar in its axes: I want my x-axis to be "Time" (I have a DateTime column in my database) and my y-axis to be an integer (I have a variable for that as well in my database).
I need help sending the model data to my Razor view.
Note that I am already using SignalR to display a real-time table. I also want to know if it can be used to automatically update the Highchart as well.
Below is a code snippet of the script in my view. I used the code provided in the HighChart Demo link for generating the chart. Please tell me where I should apply changes in my code.
@section Scripts{
<script src="~/Scripts/jquery.signalR-2.2.0.js"></script>
<!--Reference the autogenerated SignalR hub script. -->
<script src="~/SignalR/Hubs"></script>
<script type="text/javascript">
    $(document).ready(function () {
        // Declare a proxy to reference the hub.
        var notifications = $.connection.dataHub;
        //debugger;
        // Create a function that the hub can call to broadcast messages.
        notifications.client.updateMessages = function () {
            getAllMessages();
        };
        // Start the connection.
        $.connection.hub.start().done(function () {
            alert("connection started");
            getAllMessages();
        }).fail(function (e) {
            alert(e);
        });
        //Highchart
        Highcharts.setOptions({
            global: {
                useUTC: false
            }
        });
        //Fill chart
        $('#container').highcharts({
            chart: {
                type: 'spline',
                animation: Highcharts.svg, // don't animate in old IE
                marginRight: 10,
                events: {
                    load: function () {
                        // set up the updating of the chart each second
                        var series = this.series[0];
                        setInterval(function () {
                            var x = (new Date()).getTime(), // current time
                                y = Math.random();
                            series.addPoint([x, y], true, true);
                        }, 1000); //300000
                    }
                }
            },
            title: {
                text: 'Live random data'
            },
            xAxis: {
                type: 'datetime',
                tickPixelInterval: 150
            },
            yAxis: {
                title: {
                    text: 'Value'
                },
                plotLines: [{
                    value: 0,
                    width: 1,
                    color: '#808080'
                }]
            },
            tooltip: {
                formatter: function () {
                    return '<b>' + this.series.name + '</b><br/>' +
                        Highcharts.dateFormat('%Y-%m-%d %H:%M:%S', this.x) + '<br/>' +
                        Highcharts.numberFormat(this.y, 2);
                }
            },
            legend: {
                enabled: false
            },
            exporting: {
                enabled: false
            },
            series: [{
                name: 'Random data',
                data: (function () {
                    // generate an array of random data
                    var data = [],
                        time = (new Date()).getTime(),
                        i;
                    for (i = -19; i <= 0; i += 1) {
                        data.push({
                            x: time + i * 1000,
                            y: Math.random()
                        });
                    }
                    return data;
                }())
            }]
        });
    });

    function getAllMessages() {
        var tbl = $('#messagesTable');
        var data = @Html.Raw(JsonConvert.SerializeObject(this.Model));
        $.ajax({
            url: '/nurse/GetMessages',
            data: {
                id: data.id
            },
            contentType: 'application/html ; charset:utf-8',
            type: 'GET',
            dataType: 'html'
        }).success(function (result) {
            tbl.empty().append(result);
            $("#g_table").dataTable();
        }).error(function (e) {
            alert(e);
        });
    }
</script>
}
Hi Sihem,
Thank you for contacting National Instruments. Using the LabVIEW Real-Time module, you can do development without actually having a target. While viewing the project explorer window, you can do the following steps:
Right click on the project
Select New >> Targets and Devices
Select the "New Target or Device" radio button
Select the target you would like to develop on.
Information about the LabVIEW Real-Time Module can be found here.
Regards,
Kevin H
National Instruments
WSN/Wireless DAQ Product Support Engineer -
Real time database, Logger, ICM version 8
Dear;
Currently I am checking the table t_Agent_Real_Time and the values are zeros, so I believe that writing to the real-time database may be disabled. In ICM version 8, where can I enable or disable writing to the real-time database at the Logger from the Registry (because I forgot this)?
Regards
Bilal
Dears;
It is resolved.
I just corrected the hostname for the HDS server, which I have to place in the Distributor settings at the CUCM PG in the Configuration Manager. It kept the old hostname that came from the migration tool.
Thanks
Regards
Bilal -
Real-time apply cascaded logical standby database
Hi
I have a primary database orcl
Physical standby database orcl_std
Cascaded logical standby database orcl_tri which receives archivelogs from orcl_std
Real time apply is enabled both in orcl_std (physical standby) and orcl_tri (logical standby)
When I create a table in the primary orcl, I am unable to see it on orcl_tri (although real-time apply is enabled).
However, when I switch the log in the primary, I can see the new table on orcl_tri.
My question is: why is real-time apply not working in my scenario?
orcl_std : ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION USING CURRENT LOGFILE;
orcl_tri: ALTER DATABASE START LOGICAL STANDBY APPLY IMMEDIATE;
Oracle 11.2.0.3.0
Hi mseberg,
Thanks for your reply.
There is no load or network issue, as I've just created these databases for the experiment.
I have the same output from standby and primary databases.
SQL> select bytes/1024/1024 from v$standby_log;
BYTES/1024/1024
10
10
10
I can see the output below in the standby alert log:
Fri Nov 16 08:39:51 2012
ALTER DATABASE START LOGICAL STANDBY APPLY IMMEDIATE
ALTER DATABASE START LOGICAL STANDBY APPLY (orcl)
with optional part
IMMEDIATE
Attempt to start background Logical Standby process
Fri Nov 16 08:39:51 2012
LSP0 started with pid=37, OS id=16141
Completed: ALTER DATABASE START LOGICAL STANDBY APPLY IMMEDIATE
LOGMINER: Parameters summary for session# = 1
LOGMINER: Number of processes = 3, Transaction Chunk Size = 201
LOGMINER: Memory Size = 30M, Checkpoint interval = 150M
LOGMINER: SpillScn 1953318, ResetLogScn 995548
LOGMINER: summary for session# = 1
LOGMINER: StartScn: 0 (0x0000.00000000)
LOGMINER: EndScn: 0 (0x0000.00000000)
LOGMINER: HighConsumedScn: 1955287 (0x0000.001dd5d7)
LOGMINER: session_flag: 0x1
LOGMINER: Read buffers: 16
Fri Nov 16 08:39:55 2012
LOGMINER: session#=1 (Logical_Standby$), reader MS00 pid=30 OS id=16145 sid=49 started
Fri Nov 16 08:39:55 2012
LOGMINER: session#=1 (Logical_Standby$), builder MS01 pid=39 OS id=16149 sid=44 started
Fri Nov 16 08:39:55 2012
LOGMINER: session#=1 (Logical_Standby$), preparer MS02 pid=40 OS id=16153 sid=50 started
LOGMINER: Turning ON Log Auto Delete
LOGMINER: Begin mining logfile during commit scan for session 1 thread 1 sequence 202, +DATA/orcl_std/archivelog/2012_11_15/thread_1_seq_202.349.799450179
LOGMINER: End mining logfiles during commit scan for session 1
LOGMINER: Turning ON Log Auto Delete
LOGMINER: Begin mining logfile for session 1 thread 1 sequence 202, +DATA/orcl_std/archivelog/2012_11_15/thread_1_seq_202.349.799450179
LOGMINER: End mining logfile for session 1 thread 1 sequence 202, +DATA/orcl_std/archivelog/2012_11_15/thread_1_seq_202.349.799450179
Fri Nov 16 08:40:04 2012
LOGSTDBY Analyzer process AS00 started with server id=0 pid=41 OS id=16162
Fri Nov 16 08:40:05 2012
LOGSTDBY Apply process AS03 started with server id=3 pid=45 OS id=16175
Fri Nov 16 08:40:05 2012
LOGSTDBY Apply process AS04 started with server id=4 pid=46 OS id=16179
Fri Nov 16 08:40:05 2012
LOGSTDBY Apply process AS01 started with server id=1 pid=42 OS id=16167
Fri Nov 16 08:40:05 2012
LOGSTDBY Apply process AS05 started with server id=5 pid=47 OS id=16183
Fri Nov 16 08:40:05 2012
LOGSTDBY Apply process AS02 started with server id=2 pid=44 OS id=16171
Do you think real-time apply wasn't set up properly? -
HDS real-time tables show negative figures
The Cisco real-time tables throw a strange number (figure) for offered calls every midnight between 00:00 and 00:05 (table name Call_type_real_Time).
Kindly note we have a problem with the real-time data in the Cisco database: it gives the number of calls received as negative figures during the early hours of the day. We would appreciate comments if anyone has come across the same. Any suggestion?
Our setup (versions) is as follows:
ICM 7.0_SR4;CTIOS7.0(0)SR2& ES13/ES25
CCM4.1(3)SR4d
CVP3.1(0),SR2 & ES3 -
How to Integrate real time data between 2 database servers
May 31, 2006 2:45 AM
I have a scenario where the database (DB2/400) is maintained by an AS/400 application, and my new website application, based on the J2EE platform, also accesses the same database, but the performance is very low. So we have thought of introducing a new Oracle database which will be accessed by the J2EE application, and all the data from the DB2/400 database will be replicated to the Oracle database. The only problem in that scenario is real-time data exchange between the 2 databases. How do we achieve that, considering both applications (AS/400 and the J2EE website) run in parallel and access the same information on the DB2/400 database? We also have to look at transaction management.
Thanks
Panky
DrClap
Posts:25,835
Registered: 4/30/99 Re: How to Integrate real time data between 2 database servers
May 31, 2006 11:16 AM (reply 1 of 2)
You certainly wouldn't use XML for this.
The process you're looking for is called "replication". Ask your database experts about it.
I predict that after you spend all the money to install Oracle and hire consultants to make it replicate the DB2/400 database, your performance problem will be worse.
panks
Posts:1
Registered: 5/31/06 Re: How to Integrate real time data between 2 database servers
May 31, 2006 11:55 PM (reply 2 of 2)
Yeah, I know that it's not an XML solution.
Replication is one of the options, but the AS/400 application which uses the DB2/400 DB is highly loaded, and the proposed website also uses the same database for retrieval and update purposes. All the inventory is maintained in the DB2/400 database, so I have thought of introducing a new Oracle database which will be accessed by the new website; it will have all the relevant table structures along with the data from the DB2/400 application. Now, whenever an order is placed from the new website, it should first update the Oracle database, and then this data should also migrate to the DB2/400 application in real time, so that the main inventory on DB2/400 is updated on a real-time basis, because order placement is also possible from the AS/400 application. That way the user of the AS/400 application will not get wrong data.
Is it possible to use MQ products?
-Panky
Hi,
the answer to your question is not easy. Synchronizing, integrating, or replicating data between 2 (or more) database servers is a very complicated task, even though it doesn't look like it.
First, I would recommend creating a good analysis of the data flow.
Important things are:
1) Which is the primary side for data creation? In other words, on which side (DB2 or Oracle) is the data primary (created there), and on which side is it secondary (just copies)?
2) On which side is data changed: only on the DB2 side, only on the Oracle side, or on both sides?
3) Is there data that is changed on both sides concurrently? If so, how should conflicts be resolved?
4) What does "real time" mean? Is it up to 1 ms, 1 s, 1 min, or 1 hour?
5) What should be done when replication does not work (replication crash, etc.)?
BTW, the word "change" above means INSERT, UPDATE, and DELETE commands.
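As a rough illustration of points 1)-3) for a setup where one side is authoritative: changes are captured on the primary side and applied in order on the replica. A minimal Python/SQLite sketch (the schema and the in-memory queue are illustrative; a real solution would use MQ or a replication product in place of the list):

```python
import sqlite3

def make_db():
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE inventory (sku TEXT PRIMARY KEY, qty INTEGER)")
    return db

primary, replica = make_db(), make_db()
changes = []  # stands in for the queue/MQ between the two systems

def primary_update(sku, qty):
    """All writes go through the primary, which also captures the change."""
    primary.execute("INSERT OR REPLACE INTO inventory VALUES (?, ?)", (sku, qty))
    changes.append((sku, qty))

def apply_changes():
    """Drain captured changes to the replica in the order they happened."""
    while changes:
        sku, qty = changes.pop(0)
        replica.execute("INSERT OR REPLACE INTO inventory VALUES (?, ?)", (sku, qty))

primary_update("A-100", 7)
primary_update("A-100", 5)
apply_changes()
(qty,) = replica.execute("SELECT qty FROM inventory WHERE sku = 'A-100'").fetchone()
print(qty)  # 5
```

Because only the primary accepts writes, questions 1)-3) resolve trivially; allowing writes on both sides would require an explicit conflict-resolution rule.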
Analysis should be done for every column in every table. When the analysis is ready, you can select the best system for your solution (Oracle replication, Sybase Replication Server, MQ, EJB, or your own proprietary solution). Without analysis it will be, IMHO, a shot in the dark. -
OIM: Reconciliation with Database Application Tables Connector
Hi!
I'm trying to use Database Application Tables connector 9.1.0.5.0 in OIM 11g to reconcile accounts from my target system: MySQL 5.1.37. I followed the steps in connector's guide to create and configure it and to configure my target system as a trusted source.
The problem is that user accounts are not being created in OIM. Despite this, I know the connector is retrieving the information for each user, because I can see it in the log messages generated when I execute the reconciliation job, so I suppose the problem may be in the "Modify Connector Configuration" page or in some step after this.
Additionally, I don't know if it's important, but I can't see the "User Type" field in the "OIM user account data set" on the "Modify Connector Configuration" page.
Did I forget something in the configuration, or is there something I have to do besides the steps in the guide?
Thanks in advance!
Edited by: user10857411 on Jan 11, 2011 4:10 PM
Edited by: user10857411 on Jan 11, 2011 4:12 PM
Sun IDM is better than OIM.
The reconciliation process in Sun IDM (Oracle Waveset) is easier to implement than in OIM (the worst identity solution).
Saludos Cordiales Zam -
Hello
I have a VeriStand-Project (VSP) created with my Laptop-Host (LTH) which works with my PXI, while
deploying it from my LTH. Then I installed the whole NI environment for PXI and VeriStand use on an industrial PC (iPC). I tried to deploy my VSP from the iPC to the PXI, but the following error message arose on my iPC:
The VeriStand Gateway encountered an error while deploying the System Definition file.
Details: Error -1074384569 occurred at Project Window.lvlib:Project Window.vi >> Project Window.lvlib:Command Loop.vi >> NI_VS Workspace ExecutionAPI.lvlib:NI VeriStand - Connect to System.vi
Possible reason(s):
NI-XNET: (Hex 0xBFF63147) The database information on the real-time system has been created with an
older NI-XNET version. This version is no longer supported. To correct this error, re-deploy your
database to the real-time system. ========================= NI VeriStand: NI VeriStand
Engine.lvlib:VeriStand Engine Wrapper (RT).vi >> NI VeriStand Engine.lvlib:VeriStand Engine.vi >> NI
VeriStand Engine.lvlib:VeriStand Engine State Machine.vi >> NI VeriStand Engine.lvlib:Initialize
Inline Custom Devices.vi >> Custom Devices Storage.lvlib:Initialize Device (HW Interface).vi
• Unloading System Definition file...
• Connection with target Controller has been lost.
The software versions of the NI products (MAX/My System/Software) between my LTH and the iPC are
almost the same. The only differences are:
1. LabView Run-Time 2009 SP1 (64-bit); is installed on LTH but missing on iPC. The iPC has a 32-bit system.
2. LabView Run-Time 2012 f3; is installed on LTH but missing on iPC.
3. NI-DAQmx ADE Support 9.3.5; something strange on the LTH, because normally I am using NI-DAQmx 9.5.5 and all other DAQmx products on my LTH are 9.5.5. That means NI-DAQmx Device Driver 9.5.5 and NI-DAQmx Configuration 9.5.5. On the iPC side all three products are 9.5.5: NI-DAQmx ADE Support 9.5.5, NI-DAQmx Device Driver 9.5.5 and NI-DAQmx Configuration 9.5.5.
4. Traditional NI-DAQ 7.4.4; The iPC has this SW installed. On the LTH this SW is missing.
In order to fix this problem I have formatted my PXI and I have installed the following SW from the iPC:
1. LabVIEW Real-Time 11.0.1
2. NI-488.2 RT 3.0.0
3. NI_CAN 2.7.3
Unfortunately the above stated problem still arose.
What can I do to fix this problem?
I found a hint on http://www.labviewforum.de/Thread-XNET-CAN-die-ersten-Gehversuche.
There it is written to deploy the dbc file again.
If this is a good hint, how do I deploy a dbc file?
I would feel very pleased if somebody could help me! :-)
Best regards
Lukas Nowak
Hi Lukas,
I think the problem is caused by different drivers for the CAN communication.
NI provides two driver for CAN: NI-CAN and NI-XNET.
NI-CAN is the outdated driver, which is no longer used by new hardware. NI replaced the NI-CAN driver with NI-XNET some years ago; it supports the CAN, LIN, and FlexRay communication protocols.
You wrote:
In order to fix this problem I have formatted my PXI and I have installed the following SW from the iPC:
3. NI_CAN 2.7.3
NI CAN is the outdated driver. I think that you should try to install NI-XNET instead of NI-CAN on your PXI-System, to get rid of the error message.
Regards, stephan