Waveform graph time from .csv file
Hello everyone. I had this problem about three weeks ago:
http://forums.ni.com/t5/LabVIEW/Wavefrom-graph-time-from-csv/td-p/2256754
and solved it by changing the delimiter to a comma decimal delimiter with the format %,;%.2f
But the problem has suddenly occurred again!
I have attached the VI and the .csv file in a ZIP archive.
Best regards
Oesen
Attachments:
Trykmaaling_READ.vi 36 KB
ekstra.zip 1 KB
I have deleted "row 0" and the graph looks better now, but it is still incorrect.
Best regards
Oesen
Similar Messages
-
Waveform graph time from .csv
Hello everyone
I use "read from spreadsheet file" to read a two column .csv file in a waveform graph.
Time is defined in the first column and the Data is defined in the second column.
The problem occurs on the x-axis of the graph (the time axis). It doesn't match the numbers from the .csv, as shown in the picture below, but the Y-axis has correct values.
The VI is uploaded.
Best regards
Oesen
Solved!
Go to Solution.
Attachments:
2-Column CSV.vi 13 KB
Works fine here once I configure it to use a comma as decimal delimiter (using format=" %,;%.3f " on Read From Spreadsheet File).
You seem to be using a localized version, so that should not be necessary depending on the language settings of your computer, but it does seem to be the problem.
As a first step, you should display the entire 2D array in an array indicator to ensure that your numbers don't get truncated somehow. If they do get truncated, dt will be set to zero, and a dt of zero is ignored and treated as 1.
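The truncation effect can be sketched outside LabVIEW. A minimal Python illustration (the file contents here are made up; the helper mirrors what the %,; format specifier asks the read function to do):

```python
def parse_comma_decimal(s):
    # Treat '.' as a thousands separator and ',' as the decimal point,
    # mirroring the %,;%.2f format string on Read From Spreadsheet File.
    return float(s.replace(".", "").replace(",", "."))

# Hypothetical time/value pairs as they would appear in the csv file.
# With the default period decimal point, "0,1" would truncate to 0,
# dt would collapse to 0, and the graph would fall back to dt = 1.
rows = [["0,1", "3,25"], ["0,2", "3,50"]]
data = [[parse_comma_decimal(c) for c in row] for row in rows]
dt = data[1][0] - data[0][0]  # 0.1 when parsed correctly
```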
LabVIEW Champion. Do more with less code and in less time.
Attachments:
SetDt.png 23 KB -
Loading 361000 records at a time from csv file
Hi,
One of my colleagues loaded 361,000 records from one file. How is this possible, as Excel accepts only 65,536 rows in one sheet?
Also, in the InfoPackage the following settings are selected. What do they mean?
Data Separator ;
Escape Sign "
Separator for Thousands .
Character Used for Decimal Point ,
Pls let me know.
Hi Maya,
It is quite possible. Besides MS Excel, there are editors like TextPad (and Windows Notepad) that support more than 65k rows. The file may have been generated by a program or edited outside Excel, or a newer version of Excel may have been used: MS Excel 2007 supports more than 1 million rows.
e.g. we have this csv file:
customer;product;quantity;revenue
a;x;"1.250,25";200
b;y;"5.5";300
data separator ;
- the character/delimiter used to separate fields, e.g. customer;product;quantity;revenue
escape sign "
- with "1.250,25";200 the quantity field is read as 1.250,25
separator for thousands .
- 1.250,25 means one thousand two hundred fifty point twenty-five
character used for decimal point ,
- in 1.250,25 the comma marks the decimal part
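Taken together, the settings parse a line like the one above. A plain-Python sketch (not SAP code) of the same four settings applied to a sample line:

```python
import csv
import io

line = 'b;y;"1.250,25";300'

# Data separator ';' and escape sign '"'
fields = next(csv.reader(io.StringIO(line), delimiter=';', quotechar='"'))

def to_number(s, thousands=".", decimal=","):
    # Separator for thousands '.' and character for decimal point ','
    return float(s.replace(thousands, "").replace(decimal, "."))

quantity = to_number(fields[2])  # 1250.25
```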
check
http://help.sap.com/saphelp_nw70/helpdata/en/80/1a6581e07211d2acb80000e829fbfe/frameset.htm
http://help.sap.com/saphelp_nw70/helpdata/en/c2/678e3bee3c9979e10000000a11402f/frameset.htm
hope this helps. -
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from .csv file into existing /new table. Also checked out Vikas's application regarding the same. I am trying to explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I need to know is whether, before the csv data is loaded into the timesheet table, there is any way of validating the project_key (which is the primary key of the projects table) against the projects table. I need to perform similar validations on other columns, such as customer_id against the customers table. Basically, the load should happen only after validating that the data exists in the parent tables. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing this?
Does Vikas's application do what the utility does? (I am assuming that, since the code is from 2005, the utility was not incorporated in APEX at that time.) Any helpful advice is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
You can create hidden items on the page to validate previous records before inserting data.
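The validate-before-insert idea itself is independent of APEX. A plain Python/SQLite sketch (all table and column names here are stand-ins for the ones in the question):

```python
import csv
import io
import sqlite3

# In-memory stand-ins for the parent (projects) and child (timesheet) tables
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE projects (project_key TEXT PRIMARY KEY)")
db.execute("CREATE TABLE timesheet (timesheet_date TEXT, hours_worked REAL, project_key TEXT)")
db.executemany("INSERT INTO projects VALUES (?)", [("P1",), ("P2",)])

csv_text = "project_key,timesheet_date,hours_worked\nP1,2011-01-20,8\nP9,2011-01-21,6\n"
valid_keys = {k for (k,) in db.execute("SELECT project_key FROM projects")}

loaded, rejected = [], []
for row in csv.DictReader(io.StringIO(csv_text)):
    # Only insert rows whose project_key exists in the parent table
    if row["project_key"] in valid_keys:
        db.execute("INSERT INTO timesheet VALUES (?, ?, ?)",
                   (row["timesheet_date"], float(row["hours_worked"]), row["project_key"]))
        loaded.append(row["project_key"])
    else:
        rejected.append(row["project_key"])
```

Rejected rows can then be reported back instead of silently violating the foreign key.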
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
Importing users into WGM from csv file issues/crash
Hi,
I've been importing user information from csv files into WGM via the Server > Import function.
It worked the first few times, but now when I try, the import progress bar pops up and promptly disappears without anything importing.
I've tried restarts, a new admin account, and reinstalling WGM.
I've also trashed some prefs, but I don't really know which ones I should be losing.
The server's an OD master.
Any help would be appreciated.
As a last resort, what do I need to back up/save if I were to format/reinstall OS X Server, keeping my settings etc.?
thanks
paul
What I did was:
Exported the user list to create an XML file in the correct format.
Using this format, I created a spreadsheet in Excel (sorry Apple), and in the final column I created a field that concatenated the information I wanted in the ':' delimited format of the previously exported XML.
Then I just copied and pasted via pico into a pure text file and imported that.
You have to be careful with comments in Passenger; using special characters (';!@#$%^ and others) can cause WGM to fail and crash.
SQL bulk copy from csv file - Encoding
Hi Experts
This is the first time I am creating a PowerShell script and it is almost working. I just have a problem with the encoding of the actual bulk import to SQL from the text file, since it replaces
special characters with question marks. I have set the encoding when creating the csv file, but that does not seem to carry over to the actual bulk import. I have tried different encoding scenarios, but I cannot find the proper solution.
To shortly outline what the script does:
Connect to Active Directory, fetching all users but excluding users in specific OUs
Export all users to a csv in Unicode encoding
Strip double-quote text qualifiers (if there is another way of handling that, it will be much appreciated)
Clear all records in the temporary SQL table
Import records from the csv file into the temporary SQL table (this is where the encoding goes wrong)
Update existing records in another table based on the records in the temporary table, and insert new records if not found.
The script looks as the following (any suggestions for optimizing the script are very welcome):
# CSV file variables
$path = Split-Path -parent "C:\Temp\ExportADUsers\*.*"
$filename = "AD_Users.csv"
$csvfile = $path + "\" + $filename
$csvdelimiter = ";"
$firstRowColumns = $true
# Active Directory variables
$searchbase = "OU=Users,DC=fabrikam,DC=com"
$ADServer = 'DC01'
# Database variables
$sqlserver = "DB02"
$database = "My Database"
$table = "tblADimport"
$tableEmployee = "tblEmployees"
# Initialize
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
# GET DATA FROM ACTIVE DIRECTORY
# Import the ActiveDirectory Module
Import-Module ActiveDirectory
# Get all AD users not in specified OU's
Write-Host "Retrieving users from Active Directory..."
$AllADUsers = Get-ADUser -server $ADServer `
-searchbase $searchbase -Filter * -Properties * |
?{$_.DistinguishedName -notmatch 'OU=MeetingRooms,OU=Users,DC=fabrikam,DC=com' `
-and $_.DistinguishedName -notmatch 'OU=FunctionalMailbox,OU=Users,DC=fabrikam,DC=com'}
Write-Host "Users retrieved in $($elapsed.Elapsed.ToString())."
# Define labels and get specific user fields
Write-Host "Generating CSV file..."
$AllADUsers |
Select-Object @{Label = "UNID";Expression = {$_.objectGuid}},
@{Label = "FirstName";Expression = {$_.GivenName}},
@{Label = "LastName";Expression = {$_.sn}},
@{Label = "EmployeeNo";Expression = {$_.EmployeeID}} |
# Export CSV file and remove text qualifiers
Export-Csv -NoTypeInformation $csvfile -Encoding Unicode -Delimiter $csvdelimiter
Write-Host "Removing text qualifiers..."
(Get-Content $csvfile) | foreach {$_ -replace '"'} | Set-Content $csvfile
Write-Host "CSV file created in $($elapsed.Elapsed.ToString())."
# DATABASE IMPORT
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
$batchsize = 50000
# Delete all records in AD import table
Write-Host "Clearing records in AD import table..."
Invoke-Sqlcmd -Query "DELETE FROM $table" -Database $database -ServerInstance $sqlserver
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable and autogenerate the columns
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumns -eq $true) { $null = $reader.readLine()}
Write-Host "Importing to database..."
foreach ($column in $columns) { $null = $datatable.Columns.Add() }
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {
    $null = $datatable.Rows.Add($line.Split($csvdelimiter))
    $i++
    if (($i % $batchsize) -eq 0) {
        $bulkcopy.WriteToServer($datatable)
        Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
        $datatable.Clear()
    }
}
# Add in all the remaining rows since the last clear
if ($datatable.Rows.Count -gt 0) {
    $bulkcopy.WriteToServer($datatable)
    $datatable.Clear()
}
# Clean Up
Write-Host "CSV file imported in $($elapsed.Elapsed.ToString())."
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
# Update tblEmployee with imported data
Write-Host "Updating employee data..."
$queryUpdateUsers = "UPDATE $($tableEmployee)
SET $($tableEmployee).EmployeeNumber = $($table).EmployeeNo,
$($tableEmployee).FirstName = $($table).FirstName,
$($tableEmployee).LastName = $($table).LastName
FROM $($tableEmployee) INNER JOIN $($table) ON $($tableEmployee).UniqueNumber = $($table).UNID
IF @@ROWCOUNT=0
INSERT INTO $($tableEmployee) (EmployeeNumber, FirstName, LastName, UniqueNumber)
SELECT EmployeeNo, FirstName, LastName, UNID
FROM $($table)"
try {
    Invoke-Sqlcmd -ServerInstance $sqlserver -Database $database -Query $queryUpdateUsers
    Write-Host "Table $($tableEmployee) updated in $($elapsed.Elapsed.ToString())."
}
catch {
    Write-Host "An error occurred when updating $($tableEmployee) $($elapsed.Elapsed.ToString())."
}
Write-Host "Script completed in $($elapsed.Elapsed.ToString())."
I can see that Export-Csv exports ANSI even though the encoding has been set to Unicode. Thanks for leading me in the right direction.
No - it exports as Unicode if set to.
Your export was wrong and is exporting nothing. Look closely at your code:
This line exports nothing in Unicode:
Export-Csv -NoTypeInformation $csvfile -Encoding Unicode -Delimiter $csvdelimiter
There is no input object.
This line converts any file to ANSI:
(Get-Content $csvfile) | foreach {$_ -replace '"'} | Set-Content $csvfile
Set-Content defaults to ANSI so the output file is converted.
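The pitfall can be sketched in Python as a stand-in for the Get-Content | Set-Content pair (the file name is made up): strip the quote characters, but read and write with an explicit encoding so the file is not silently rewritten in the default ANSI codepage.

```python
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "ad_users_demo.csv")  # hypothetical file
with open(path, "w", encoding="utf-16") as f:
    f.write('"UNID";"FirstName"\n"1";"Åsa"\n')

with open(path, encoding="utf-16") as f:       # read with the real encoding
    text = f.read().replace('"', "")           # strip the text qualifiers

with open(path, "w", encoding="utf-16") as f:  # keep the same encoding on write
    f.write(text)
```

In PowerShell the equivalent is passing an explicit -Encoding to Set-Content rather than relying on its default.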
Since you are just dumping into a table by manually building a recordset, why not go direct? You do not need a CSV. Just dump the results of the query to a datatable.
https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
This script dumps to a datatable object which can now be used directly in a bulkcopy.
Here is an example of how easy this is using your script:
$AllADUsers = Get-ADUser -server $ADServer -searchbase $searchbase -Filter * -Properties GivenName,SN,EmployeeID,objectGUID |
Where-Object {
    $_.DistinguishedName -notmatch 'OU=MeetingRooms,OU=Users,DC=fabrikam,DC=com' -and
    $_.DistinguishedName -notmatch 'OU=FunctionalMailbox,OU=Users,DC=fabrikam,DC=com'
} |
Select-Object @{N='UNID';E={$_.objectGuid}},
@{N='FirstName';Expression = {$_.GivenName}},
@{N='LastName';Expression = {$_.sn}},
@{N='EmployeeNo';Expression = {$_.EmployeeID}} |
Out-DataTable
$AllADUsers is now a datatable. You can just upload it.
¯\_(ツ)_/¯ -
Unable to synchronise Contacts to BB from CSV file
I'm trying to upload my contacts from a csv file without success. I constantly receive an error message. I'm using a
BB Curve 8900 and I have version 6 of Desktop Manager.
My operating system is Win 7 64-bit. I've enabled logging, and in
Pttrace.log I constantly see:
"00:14:59.872: Fetching implemented classes at .\iluptbl.cpp line 3408
00:15:00.523: Fetching implemented classes at .\iluptbl.cpp line 3408
00:15:02.016: Fetching implemented classes at .\iluptbl.cpp line 3408
00:15:31.014: Fetching implemented classes at .\iluptbl.cpp line 3408
00:15:43.051: Fetching implemented classes at .\iluptbl.cpp line 3408
00:15:43.680: Fetching implemented classes at .\iluptbl.cpp line 3408
00:15:44.506:
00:15:44.507: Begin ILX Session, Source=Importer/Eksporter ASCII, Target=Urządzenie
00:15:44.531:
00:15:44.532: Starting One-Way Sync (from scratch)
00:15:44.803: Phase=10, User=22E867D2: Reading data from Urządzenie Książka adresowa
00:15:45.111: Finished Reading 17 Records + 0 Deletes + 0 Unchangeds for User=22E867D2 from Urządzenie Książka adresowa (slow sync input for resync)
00:15:45.112: Phase=20, User=22E867D2: Reading data from Importer/Eksporter ASCII Książka adresowa
Tue Apr 26 00:15:45 2011: Error Blad interfejsu API..4238 at .\AsciiConnectorIConnector.cpp line 375
Tue Apr 26 00:15:45 2011: Error 4107.4107 at .\sdk_data.cpp line 875
Tue Apr 26 00:15:45 2011: Error 0.4107 at .\ciltrans.cpp line 241
Tue Apr 26 00:15:45 2011: Error 4107.4107 at .\Ilx_sdk.cpp line 220
00:15:45.134: Translation Unit Status: User=22E867D2, rc=87, Phase=20, TrErr=4107, SysErr=0 at .\xlatev3.cpp line 650
00:15:45.136:
00:15:45.137: End ILX Session, elapsed time = 0.000 seconds
In Tif.log I see that the program successfully exports data from the device to the tif file, but when it attempts to open the
csv file it fails:
"-------- ilsdk finished processing a_record_for_export_to_TIF, rc=0, nRc=0, action=19
-------- ilsdk starting assembly of a_record_for_export_to_TIF
-------- END OF FILE
-------- ilsdk finished processing a_record_for_export_to_TIF, rc=0, nRc=4006, action=3
-------- Ending Load From Target phase; starting next phase
Finished Reading 17 Records + 0 Deletes + 0 Unchangeds for User=22E867D2 from Urządzenie Książka adresowa (slow sync input for resync)
ILTIFReopenFile/NO-OP
-------- Ending next phase; starting Load From Source phase
Could you help me, as entering over 600 contacts would be painful.
brgds
Wiesiek
This is exactly what I am dealing with. Blackberry Link will work great for a while and then something happens and I cannot sync my contacts. I have spent many hours and days trying to fix this to no avail. BB support has been great, but I still go through a long process each time. This week I finally gave up and started using the Microsoft Hotmail Outlook Connector, since my company does not offer ActiveSync or Exchange. It works flawlessly. I would definitely check it out. If you switch to a Windows Phone or Android, this is one of the options they recommend you use.
In a nutshell Link is awful and BB should be ashamed of the product they put on the market. I will only use it for syncing pictures and documents. Good luck! -
Display data from CSV file in iWeb page
Hi,
I would like to display data from a CSV file in an iWeb page if a date value from the CSV file matches today's date from the system. Here is an example.
CSV data values
01/20/2011,Sunny,87
01/21/2011,Cloudy,100
01/22/2011,Rainy,60
If today's date value is 01/21/2011 the page should display 01/21/2011 Cloudy 100 in a tabular format.
Appreciate your help in providing HTML code for this issue.
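An actual iWeb page would need an HTML/JavaScript snippet widget, but the selection logic itself is simple. A Python sketch of it (the fixed date stands in for the system date so the example is reproducible):

```python
import csv
import io
from datetime import date

csv_text = "01/20/2011,Sunny,87\n01/21/2011,Cloudy,100\n01/22/2011,Rainy,60\n"
today = date(2011, 1, 21).strftime("%m/%d/%Y")  # in a live page: date.today()

# Pick the first row whose date column matches today's date
match = next((row for row in csv.reader(io.StringIO(csv_text)) if row[0] == today), None)

# Render the matching record as a one-row HTML table
html = "<table><tr>" + "".join(f"<td>{c}</td>" for c in match) + "</tr></table>"
```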
Thanks
I suspect there is a soft return in the Excel database somewhere that can't be seen. Take the csv/txt file into Notepad and look for a line that starts oddly compared to the others.
I haven't had luck removing soft returns from excel files so I do this a rather odd way. I take the excel file into InDesign as a table, and then use find/change to replace any soft returns with nothing, then convert the text to table and then export the text out again by going export, and selecting text from the dropdown menu.
For my money, I always save tab delimited text files from excel so that if a field does contain commas, it doesn't "trick" indesign into thinking a new field is beginning or not... instead the field delimiters are tabs and they are unlikely to have been used in the excel database.
If you do choose to use this InDesign import method of mine to clean up the database, I also noticed two things in your screengrab: first, some fields have spaces at the start of the text, easy enough to fix with a GREP that looks for ^\s (start of a sentence followed by a space) and replaces it with nothing. The second thing is the T&C field, where all entries (at least in the screengrab) start the same; if all entries in the database start the same, couldn't that line just live in the InDesign file? It's only a small detail, I know. -
How to load date and time from text file to oracle table through sqlloader
hi friends
I need you to show me what I am missing to load date and time from a text file into an Oracle table through SQL*Loader.
this is my data in this path (c:\external\my_data.txt)
7369,SMITH,17-NOV-81,09:14:04,CLERK,20
7499,ALLEN,01-MAY-81,17:06:08,SALESMAN,30
7521,WARD,09-JUN-81,17:06:30,SALESMAN,30
7566,JONES,02-APR-81,09:24:10,MANAGER,20
7654,MARTIN,28-SEP-81,17:24:10,SALESMAN,30
my table in the database is emp2:
create table emp2 (empno number,
ename varchar2(20),
hiredate date,
etime date,
ejob varchar2(20),
deptno number);
the control file code is in this path (c:\external\ctrl.ctl):
load data
infile 'C:\external\my_data.txt'
into table emp2
fields terminated by ','
(empno, ename, hiredate, etime, ejob, deptno)
this is the error:
C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 5
C:\>
Any help is greatly appreciated.
thanks
Edited by: user10947262 on May 31, 2010 9:47 AM
load data
infile 'C:\external\my_data.txt'
into table emp2
fields terminated by ','
(empno, ename, hiredate, etime, ejob, deptno)
Try
load data
infile 'C:\external\my_data.txt'
into table emp2
fields terminated by ','
(empno, ename, hiredate, etime "to_date(:etime,'hh24:mi:ss')", ejob, deptno)
this is the error :
C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 5
C:\>
That's not an error, you can see errors within log and bad files. -
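The 'hh24:mi:ss' mask in the suggested control file maps onto hours, minutes, and seconds in 24-hour form. A quick Python sanity check of the same parse on the sample data (strptime's %H:%M:%S is the equivalent mask):

```python
from datetime import datetime

# Parse the time field from the first sample row, 09:14:04
etime = datetime.strptime("09:14:04", "%H:%M:%S")
```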
Loading data from .csv file into Oracle Table
Hi,
I have a requirement where I need to populate data from .csv file into oracle table.
Is there any mechanism so that i can follow the same?
Any help will be fruitful.
Thanks and regards
You can use SQL*Loader or external tables for your requirement.
Missed Karthick's post ... already there :)
Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM -
Loading records from .csv file to SAP table via SAP Program
Hi,
I have a .csv file with 132,869 records and I am trying to load it to an SAP table with a customized SAP program.
After executing the program, only 99,999 records are being loaded into the table.
Is there some setting to define how many records can be loaded into a table? Or what else could be the problem?
Pls advise.
Thanks!!!
Hi Arun,
A datasource needs an extract structure to fetch data. It is nothing but a temp table to hold data.
First you need to create a table in SE11 with fields matching those coming from the CSV file.
Then you need to write a report program to read your CSV file and populate your table in BW.
Then you can create a datasource on top of this table.
After that, replicate and load data into the PSA and on to the upper flow.
Regards,
Jaya Tiwari -
How to refer to/store a value from a csv file in a control file
Hi,
Consider the following control file script.
Load data
infile 'suv.csv'
append into table mast_equipmnet_test
fields terminated by "," optionally enclosed by '"'
TRAILING NULLCOLS
equipment_id,
sub_vehicle_type,
ebiz_carrier_no expression "(select ebiz_carrier_no from mast_carrier where carrier_id=?)",
licence_no,
equip_type,
ebiz_appown_no,
ebiz_equip_no sequence(1,1)
here is my csv file
CABNO, SUBTYPE, CARRIER_ID, REG_NO, VEHICLE_TYPE, EBIZ_APPOWN_NO
6954, SUMO, SWAMY, 6954, SUV, 228
9183, SUMO, SWAMY, 9183, SUV, 228
3173, QUALIS, SWAMY, 3173, SUV, 228
In my csv file I have carrier_ids, which are string values, in the 3rd column.
For every carrier_id, the corresponding ebiz_carrier_no (a numeric value) is stored in a master table called "mast_carrier".
While loading the data I need to fetch the ebiz_carrier_no for each carrier_id from the mast_carrier table,
but here I got stuck in the WHERE clause of the SELECT statement.
I am not able to refer to the carrier_id from the csv file in the WHERE clause.
Can anybody tell me how to refer to a value from the csv file in the SELECT statement of the control file script?
cheers
RRK
Sorry..
"EXPRESSION" is not needed..
ebiz_carrier_no "(select ebiz_carrier_no
from mast_carrier
where carrier_id=:ebiz_carrier_no )",
"Tested" as
load data
infile *
into table t truncate
fields terminated by ','
(id,
name "(select ename from emp
where empno = :name)")
begindata
1,7900
2,7902
SQL> select * from t;
ID NAME
1 JAMES
2 FORD -
How to load the data from .csv file to oracle table???
Hi,
I am using Oracle 10g and PL/SQL Developer. Can anyone help me with how to load the data from a .csv file into an Oracle table? The table is already created with the required columns. The .csv file has about 10 lakh (1,000,000) records. Is it possible to load 10 lakh records? Can anyone please tell me how to proceed?
Thanks in advance
981145 wrote:
Can you tell me more about SQL*Loader? How do I know whether that utility is available for me or not? I am using an Oracle 10g database and PL/SQL Developer.
SQL*Loader is part of the Oracle client. If you have a developer installation you should normally have it on your client.
The command is
sqlldr
Type it and see if you have it installed.
Have a look also at the FAQ link posted by Marwin.
There are plenty of examples also on the web.
Regards.
Al -
Automatic Deprovisioning of AD resource Accounts from CSV file attribute
My scenario is somewhat like this,
I have a CSV flatfile Active Sync which contains the following columns:
accountId,firstname,lastname,department,location,region
ausmani,Arsalan,Bhagwan,Uphone,Milpitas,US
aahmed,Aftab,Singh,Telenor,Cairo,EMEA
hkhan,Hello,Khan,Lahore,Dublin,EMEA
I have created a role and has assigned AD resource to it. I have hardcoded this role in the waveset.roles field name in my creation form.
When I start FlatFileActiveSync, the 3 accounts mentioned above are created in IDM and are also assigned the AD role, and hence they are automatically provisioned to AD, since I am assigning the resource based on the role.
I am using the Update User workflow in the poll-workflow configuration of my flatfile synchronization policy.
Currently I am able to automatically provision an account from the CSV file into IDM and on to AD. All of this is automatic.
My problem is how to automatically disable and deprovision accounts via the CSV. What should I include in the CSV so that IDM will know that an account has to be disabled and deprovisioned from the resource? Moreover, which workflow do I have to use?
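One way the feed itself can signal deletions can be sketched like this (plain Python; the extra status column is an assumption for illustration, not a documented IDM feature):

```python
import csv
import io

# Hypothetical feed with a status column; the sync logic branches on it.
csv_text = (
    "accountId,firstname,lastname,status\n"
    "ausmani,Arsalan,Bhagwan,active\n"
    "hkhan,Hello,Khan,deleted\n"
)
to_deprovision = [r["accountId"] for r in csv.DictReader(io.StringIO(csv_text))
                  if r["status"] == "deleted"]
```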
Thanks,
Farhan.Even I am struck at the same place. Please let me know if you find out.
Thanks you in advance
Prabhu -
Import From CSV File statement runs forever, no error, does not finish
Hello,
I am trying to import a CSV file in a JAVA program, with the following statement:
IMPORT FROM CSV FILE '/debug/testdatabase/FILE.csv'
INTO "JOSEPH"."TEST_TABLE"
WITH COLUMN LIST IN FIRST ROW
RECORD DELIMITED BY '\n'
FIELD DELIMITED BY '\t' ERROR LOG '/debug/testdatabase/file.err'
THREADS 10
BATCH 10000
I have two HANA instances on different machines A and B:
Both machines run HANA version 1.00.74.00.389160 (NewDB100_REL), while the OS is
SUSE Linux Enterprise Server 11.1 on machine A and
SUSE Linux Enterprise Server 11.2 on machine B.
The statement above runs fine on machine A and the rows are imported properly from JAVA as well as when executed from HANA Studio SQL console.
If I copy the file to machine B and try the exact same statement with the same file, it does not finish (neither from JAVA nor from HANA Studio SQL console). There is no error either. It cannot be cancelled, only a HANA restart stops the statement. Also the sample file I use has only 2 rows, and memory does not seem to be a problem.
I seem to have a similar problem to the one described here, but the answers there do not help me: http://scn.sap.com/thread/3396582 I specified the record delimiter, and I used a python script to check for any strange characters that are not supposed to be there, but didn't find any.
If I copy the file to my windows PC and use the "File Menu -> Import -> SAP Hana Content -> Data from Local file" function, it imports the file correctly into B, but I need to be able to do it from JAVA.
Machine A administration view:
Machine B administration view:
If you have any idea what might cause this behavior or where I can find more information on this problem, please give me a hint.
Hi Joseph,
First, from the pictures, the revision of your SAP HANA instance is 73 instead of 74. Since I have no identical environment, I cannot test it for you. But can you try the simplest scenario? Create a table with only one column and try to import a CSV file with only one row.
Best regards,
Wenjun