Save CSV in Unicode
Hi,
I have the following code:
private void button3_Click(object sender, EventArgs e)
{
    SaveFileDialog sf = new SaveFileDialog();
    sf.Filter = "CSV file (*.csv)|*.csv|All Files (*.*)|*.*";
    if (sf.ShowDialog() == DialogResult.OK)
    {
        string savePath = Path.GetDirectoryName(sf.FileName);
        File.WriteAllText(sf.FileName, richTextBox1.Text);
    }
}
which works fine, but when I look at the document in Excel I'm getting weird characters for &, accented letters, etc. I believe I need to save the file in Unicode. Any ideas how I can get that done?
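For context: File.WriteAllText without an explicit encoding writes UTF-8 with no byte-order mark, and Excel then guesses the local ANSI code page, which garbles non-ASCII characters. Passing an encoding that emits a BOM (e.g. new UTF8Encoding(true) in .NET) is the usual fix. Here is the same idea as a minimal Python sketch; the file name and cell contents are only examples:

```python
import csv

# "utf-8-sig" prepends a UTF-8 byte-order mark (BOM); Excel uses the
# BOM to detect the encoding instead of falling back to the local
# ANSI code page.
rows = [["Name", "Notes"], ["Böhm", "R&D ümlauts"]]
with open("report.csv", "w", encoding="utf-8-sig", newline="") as f:
    csv.writer(f).writerows(rows)
```

Opening the resulting file in Excel should then show the special characters correctly.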
@ TenPart,
I am glad to know you solved this problem and thanks for sharing the solution.
It will be very beneficial for other community members who have the similar questions.
Best regards,
Kristin
Similar Messages
-
How to save CSV file in application server while making infospoke
How to save CSV file in application server to be used as destination while making infospoke.
Please give the steps.
Hi,
If you want to load your flat file from the application server, you first need to transfer the file from your desktop (computer) to the application server, for example via FTP.
Try the function module ARCHIVFILE_CLIENT_TO_SERVER. You just need to give the source path and the target path.
Go to transaction SE37, which is the Function Builder screen, enter the function name ARCHIVFILE_CLIENT_TO_SERVER, and press F8 (Execute).
For the source path variable, give the file path where the file resides, e.g. C:\users\xxx\desktop\test.csv.
For the target path, give the directory path on the application server, e.g. /data/bi_data.
Remember that directories on the server start with /.
You have to know where to place the file.
Otherwise, use a third-party FTP tool such as Core FTP or AbsoluteFTP to connect to your application server.
Otherwise...
Go to transaction AL11. From there, you can find the directories available on the application server.
For example, if you want to save the file in the directory "DIR_HOME", you can find the path of each directory in the adjacent column. With that, you can specify the target path: the directory name followed by the file name with a .CSV extension.
Hope this helps
regards
gaurav -
Hi,
I'm using the following JavaScript statement to save the web query result into a CSV file:
window.open(SAP_BW_URL_Get()+
"&DATA_PROVIDER=DATAPROVIDER_1&CMD=EXPORT&FORMAT=CSV");
At present the data is saved into the default file generated by SAP BW server.
(example: SAP45B1PWBNRIC5DG8L131GG5GI2.csv)
Is there any way to save the query result into my own file?
I.e., I want to give my own file name in the JavaScript or in the web template to store the query result.
Please help me on this.
Regards
Kandasamy
If Matt's post doesn't help, consider asking in the Office:Mac forum hosted by Microsoft:
Office for Mac forums
Be sure to post your Mac OS version and your version of Office when you post there. It's a very useful forum. -
hey all,
for using Data Merge in InDesign I need a simple Unicode .csv,
but Numbers only exports UTF-8, and a plain "Unicode" export is not available.
thanks.
UTF-8 is already "Unicode" in every sense of the word. I suppose what you want is the UTF-16 variety? One way to get that would be to export as UTF-8 and then open the file with a text editor like TextEdit or TextWrangler and resave it as UTF-16.
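The TextEdit round trip can also be scripted. A minimal Python sketch of the same conversion; the file names and contents here stand in for the actual Numbers export:

```python
# Create a stand-in for the UTF-8 CSV that Numbers exports.
with open("export-utf8.csv", "w", encoding="utf-8") as f:
    f.write("name,city\nBöhm,Zürich\n")

# Resave it as UTF-16 (Python's "utf-16" codec writes a byte-order
# mark), which many applications label simply "Unicode".
with open("export-utf8.csv", encoding="utf-8") as src, \
     open("export-utf16.csv", "w", encoding="utf-16") as dst:
    dst.write(src.read())
```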
-
Hi,
Any method to resolve the "STAThreadAttribute" error when the new Save As dialog opens? I am using VB.NET.
Hi Yin Kuan Loke,
There are some threads on SCN which explain how to do this. One of the most recent is http://scn.sap.com/thread/3431919.
Regards,
Eric -
Are there any versions of Excel (chinese, japanese, russian... 2003, 2007, 2010...) that can save CSV files in Unicode (either UTF-8 or UTF-16)?
If not, is the only solution to go with tab-delimited files (the save-as-Unicode-Text option)?
Hi Mark,
I have the same problem: trying to save my CSV file in UTF-8 encoding. After several hours of searching, and trying this in my VSTO add-in as well, I got nothing. Saving a file with the Unicode option in Excel creates a TAB-separated file. Because I'd like to save the
file from my add-in application, the best thing to do (for my problem) is to save the file as Unicode tab-delimited and then replace all tabs in the file with commas automatically.
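That tab-to-comma step is worth doing with a CSV-aware tool rather than a plain search-and-replace, since a field that itself contains a comma must end up quoted. A Python sketch of the whole workaround; the file names and data are only examples:

```python
import csv

# Simulate Excel's "Unicode Text" output: UTF-16, tab-delimited.
with open("users.txt", "w", encoding="utf-16", newline="") as f:
    csv.writer(f, delimiter="\t").writerows(
        [["company", "country"], ["Panasonic Co., Ltd.", "Japan"]])

# Convert to a comma-separated UTF-8 CSV. A CSV-aware reader/writer
# quotes the field that already contains a comma instead of
# corrupting it, which a blind tab-to-comma replace would not do.
with open("users.txt", encoding="utf-16", newline="") as src, \
     open("users.csv", "w", encoding="utf-8-sig", newline="") as dst:
    csv.writer(dst).writerows(csv.reader(src, delimiter="\t"))
```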
I don't think there is a direct way to save CSV as unicode in Excel. And I don't understand why. -
User upload from a csv file via users_gen
Hello SRM Guru,
I am uploading users via transaction USERS_GEN from a CSV file. All the users are uploaded correctly, but I see that some special characters (which we have in the German language) are not copied correctly; for example, the last name is copied as
B#hm instead of Böhm, although my logon language was German.
Any help is appreciated !!!
Thanks,
Jack
Hello,
Right-click on your file, choose "Open with", and choose Notepad. Then in Notepad choose "Save as" and, for Encoding, choose Unicode.
But that's just a quick trick to convert your file to a Unicode one. There should be an option in the editor you use to save the CSV directly with a Unicode encoding.
P. -
Bug: IRR Save As CSV doesn't filter non-displ cols, limits to 65535 rows
Hi all,
I've found two things regarding Interactive Report Regions and Save As CSV, where at least the second one is a bug; the first could be a security feature...
here we go:
1) If you display more columns in your Interactive Report than you export to CSV (using the condition request != CSV on a report column), then a filter applied to a column that isn't exported isn't applied in the export at all. E.g. if I display sales data and filter to only the sales of the country "Austria", I still get every country's sales in my export when the country column itself isn't exported.
2) My Interactive Report region shows ~86,000 rows and the maximum number of rows is set to 100,000, yet Save As CSV exports only 65,535 rows.
I know that earlier versions of Excel were limited to 65k rows, but I would still like to write an unlimited number of rows to a CSV file. I don't necessarily use this file in Excel.
Peter
Dippy,
The 65k row limitation is built into the IR setup. The ONLY way you can get around it is to add a custom download routine to your IR. You could build a routine that grabs the SQL for your IR, executes it in a PL/SQL process, and downloads the CSV that way (similar to the existing APEX 4.0 plugin).
That way you would write only one routine, but add it to all your IR reports.
Thank you,
Tony Miller
Webster, TX
While it is true that technology waits for no man; stupidity will always stop to take on new passengers.
If this question is answered, please mark the thread as closed and assign points where earned.. -
Using cffile to make a unicode textfile
Hi, I am trying to make a Unicode text file from a query. I can make the file as a .csv with no problem, using chr(34) as the field separator character, and all works fine. However, I want to create a Unicode .txt file that is identical to the file created by Excel when I save a .csv as a Unicode .txt. After looking around in the CF forum I think I have learnt that I need to use the charset="utf-8" attribute in cffile, but I am not sure what I need to use as the separator character (i.e. what does Excel use when it saves as a Unicode txt file?). Many thanks,
Oli
You can try the code in the following example.
http://www.cflib.org/udf.cfm?id=1197&enable=1
In order to save the file in Unicode you can use syntax such as the following:
<cffile action="write" charset="utf-8"
file="#myPath#unicodeFile.txt" output="#myUnicodeContent#" /> -
Saving .csv into internal table - using dataset (',' comes between data)
Hi experts,
I need to read a .csv file from the application server into an internal table.
I am using the code below.
gt_raw and gwa_raw are in DXRAWDATA format.
OPEN DATASET gv_pfile FOR INPUT IN TEXT MODE ENCODING DEFAULT.
*--- Display error messages if any.
IF sy-subrc NE 0.
WRITE:/ 'FILE UPLOAD FAILED - ERROR NO. : ', sy-subrc.
EXIT.
ELSE.
DO.
READ DATASET gv_pfile INTO gwa_raw.
IF sy-subrc NE 0.
EXIT.
ELSE.
APPEND gwa_raw TO gt_raw.
CLEAR gwa_raw.
ENDIF.
ENDDO.
*--Close the Application server file (Mandatory).
CLOSE DATASET gv_pfile.
ENDIF.
DELETE DATASET gv_pfile.
LOOP AT gt_raw into gwa_raw.
IF SY-TABIX > 1.
SPLIT gwa_raw AT ',' INTO gwa_cust-cust_code
gwa_cust-cust_name
gwa_cust-grp_name.
APPEND gwa_cust TO gt_cust.
CLEAR: gwa_cust, gwa_raw.
ENDIF.
ENDLOOP.
My program works fine,
but when gwa_cust-grp_name contains a value such as "panasonic co., ltd."
it takes only "panasonic co.,"
and drops "ltd.", since I am using the SPLIT command.
Is there any other way to do this?
Please help me solve this issue.
Thanks.
Hi,
I notice you have marked the message as answered, but I just wanted to let you know there is a solution. The trick is to parse into an internal table and then to find and reassemble fields that were split because they contain a comma. The ABAP program below is a commented example.
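(Incidentally, if the file can be post-processed outside ABAP, languages with a CSV parser in their standard library handle commas inside quoted fields natively, so no token reassembly is needed. A short Python illustration with similar data:)

```python
import csv
from io import StringIO

# Sample records with commas inside quoted fields.
raw = '"CompanyOne NV",500,"Antwerp","Belgium"\n' \
      '"CompanyTwo,Inc",600,"New York,NY","USA"\n'

# csv.reader honors the quotes, so the embedded commas do not
# split the record.
rows = list(csv.reader(StringIO(raw)))
```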
Rgds,
Mark
REPORT zcsv_parse.
DATA:
tokens TYPE i.
TYPES: BEGIN OF ty_result,
company TYPE char20,
compnr TYPE i,
city TYPE char30,
country TYPE char30,
END OF ty_result.
DATA:
gt_rawtab TYPE TABLE OF string,
gw_rawtab LIKE LINE OF gt_rawtab,
gt_result TYPE TABLE OF ty_result,
gw_result LIKE LINE OF gt_result,
gt_parse TYPE TABLE OF string,
gw_parse LIKE LINE OF gt_parse.
DEFINE %csvline.
gw_rawtab = &1.
append gw_rawtab to gt_rawtab.
END-OF-DEFINITION.
START-OF-SELECTION.
* Create CSV lines, some with a comma inside a token
%csvline '"CompanyOne NV",500,"Antwerp","Belgium"'.
%csvline '"CompanyTwo,Inc",600,"New York,NY","USA"'.
%csvline '"CompanyThree,Ltd",700,"Sydney,NSW","Australia"'.
* Parse the raw CSV
LOOP AT gt_rawtab INTO gw_rawtab.
REFRESH gt_parse.
SPLIT gw_rawtab AT ',' INTO TABLE gt_parse.
DESCRIBE TABLE gt_parse LINES tokens.
* If extra commas: token count higher than field count
IF tokens > 4.
PERFORM reassemble.
ENDIF.
* At this point each entry in GT_PARSE contains exactly
* one result field => build the result table
LOOP AT gt_parse INTO gw_parse.
* Strip quotes from text fields
REPLACE ALL OCCURRENCES OF '"' IN gw_parse WITH ''.
CASE sy-tabix.
WHEN 1. gw_result-company = gw_parse.
WHEN 2. gw_result-compnr = gw_parse.
WHEN 3. gw_result-city = gw_parse.
WHEN 4. gw_result-country = gw_parse.
ENDCASE.
ENDLOOP.
APPEND gw_result TO gt_result.
ENDLOOP.
* Show the formatted result
LOOP AT gt_result INTO gw_result.
WRITE: / gw_result-company, gw_result-compnr,
gw_result-city, gw_result-country.
ENDLOOP.
*& Form reassemble
* Merges tokens that were split because they contain a comma
FORM reassemble.
DATA: lastpos TYPE i,
lastchar TYPE c,
currtoken LIKE sy-tabix,
nexttoken LIKE sy-tabix,
gw_next LIKE gw_parse.
LOOP AT gt_parse INTO gw_parse.
lastpos = STRLEN( gw_parse ) - 1.
lastchar = gw_parse+lastpos(1).
* Token starts with quote but does not end with one =>
* must merge with the next token
IF gw_parse+0(1) = '"' AND lastchar <> '"'.
currtoken = sy-tabix.
nexttoken = sy-tabix + 1.
READ TABLE gt_parse INTO gw_next INDEX nexttoken.
CONCATENATE gw_parse gw_next INTO gw_parse SEPARATED BY ','.
MODIFY gt_parse FROM gw_parse INDEX currtoken.
DELETE gt_parse INDEX nexttoken.
ENDIF.
ENDLOOP.
ENDFORM. "reassemble -
How to convert the local file into unicode file?
Hi All,
I need to read a local file (GUI_UPLOAD) and save it as a Unicode file.
I've found the class CL_ABAP_CONV_OUT_CE but I have no idea how to use it.
Should I read the file in binary mode and then convert it into Unicode? And at the end save it (GUI_DOWNLOAD)?
Thanks
Adam
Hi,
Check these classes; they will help you:
CL_ABAP_CONV_IN_CE: reading binary data
CL_ABAP_CONV_OUT_CE: exporting binary data
CL_ABAP_CONV_X2X_CE: reading and exporting binary data and changing the format
Regards,
Satish -
With RoboHelp 7, does RoboHelp for Word now support Unicode
and therefore most non-Western languages, i.e. Chinese, Japanese,
Russian, etc.?
In the online Help of RoboHelp for Word 7 (trial version), the
following statement can be found under "What's new in Adobe
RoboHelp 7":
Support for Unicode
Create content in multiple languages, in the same RoboHelp
project. Topic, index, TOC, and glossary can have content in
multiple languages. You can import, open, save, and publish
Unicode-encoded content.
I am not sure if this statement also refers to the
WinHelp output of RoboHelp for Word, as it is known that Microsoft's
WinHelp compiler (HCW: Help Compiler Workshop), which RoboHelp for
Word possibly still uses, has not been changed for 10 years. HCW
supports DBCS but definitely not Unicode.
It would of course be a nice surprise if Adobe had actually
managed to make the Unicode encoding work in RoboHelp for Word.
Could you give more details on this?
I have performed some tests with a Japanese WinHelp 4 project
and the result looks good. I think you have to install RH7 on a
Japanese Windows system. I find running RH7 on an English Windows
with the Japanese language files installed isn't enough.
Interesting question.
I am certainly one of those who would go the HTML route as
the HTML code created there is much cleaner and less prone to
issues than HTML created by Word. Also you can get at the HTML and
do things in the help that are not possible when working with Word.
However, you have hit the nail on the head when you mention future
authors in your company. Increasingly I believe that being able to
work with an HTML tool, which is not the same as needing to have a
good understanding of HTML, will become a basic expectation of
anyone seeking a technical authoring role, if it is not already.
But there are many organisations where the job is part time and
knowledge of Word is all that can be expected. Personally I would
go the HTML route and force the issue. It should not take too much
to get someone else to the point where they can follow on.
When I first created help using RH HTML, I started by using
Word as the editor. I hit some problems and was persuaded to use
the HTML editor. I got help along the way, and now I would not
consider anything else. If I can do it, anyone can.
If you are going to import from a project created using Word,
do take a look at the topic on my site about importing from Word. -
Download Reports in CSV or pdf format
Hi to all,
I'm trying to download a report in CSV format that involves the list of more than 400,000 users and their assigned resources.
As I understand it, before generating this file Sun IDM executes a task that returns the list of these 400,000 users, and only after
that does IDM ask me, with a window alert, whether I want to "View" or "Save" the CSV file.
I noticed that at the moment IDM asks me whether and where to save, I am not logged in to IDM, so
I can see only the successfully executed task and the list of users in the task, but no CSV or prompt to save the CSV file.
Do you think I need an open IDM session, with the same IDM user that executed the report, at the moment of saving the CSV?
And what about the task that was already executed: how can I download the CSV without executing the task another time?
Thanks in advance
Best regards
> I noticed that at the moment IDM asks me whether and where to save, I am not logged in to IDM, so I can see only the successfully executed task and the list of users in the task, but no CSV or prompt to save the CSV file.
The following method converts a finished report to a string in CSV format:
public static String printReportsAsCSV(String taskName, String sep, Locale locale)
    throws WavesetException {
    TaskManager tm = Server.getServer().getTaskManager();
    RepositoryResult er = tm.getExtendedResult(taskName, null);
    if (er != null && er.hasNext()) {
        ReportRenderer rend = new ReportRenderer(locale);
        StringBuffer sb = new StringBuffer();
        while (er.hasNext()) {
            TaskResult tr = (TaskResult) er.next().getObject();
            WavesetResult res = tr.getResult();
            if (res != null && res.getResults() != null && res.getResults().size() > 0) {
                Report report = (Report) res.getResult("report");
                rend.renderToSeparatedText(report, sb, sep);
                return sb.toString();
            }
        }
    }
    return null;
}
Usage example:
System.out.println(printReportsAsCSV("All Admin Roles", ",", Locale.getDefault()));
The report is stored in the repository as a TaskResult object, so you
can also export it in XML form and process with XSL or something.
Cheers,
Vladimir -
SQL bulk copy from csv file - Encoding
Hi Experts
This is the first time I am creating a PowerShell script, and it is almost working. I just have some problems with the encoding of the actual bulk import to SQL from the text file, since it replaces
special characters with a question mark. I have set the encoding when creating the CSV file, but that does not seem to carry over to the actual bulk import. I have tried different scenarios for the encoding part, but I cannot find the proper solution.
To shortly outline what the script does:
Connect to Active Directory, fetching all users but excluding users in specific OUs
Export all users to a csv in unicode encoding
Strip double quote text identifiers (if there is another way of handling that it will be much appreciated)
Clear all records temporary SQL table
Import records from csv file to temporary SQL table (this is where the encoding is wrong)
Update existing records in another table based on the records in the temporary table and insert new record if not found.
The script looks as the following (any suggestions for optimizing the script are very welcome):
# CSV file variables
$path = Split-Path -parent "C:\Temp\ExportADUsers\*.*"
$filename = "AD_Users.csv"
$csvfile = $path + "\" + $filename
$csvdelimiter = ";"
$firstRowColumns = $true
# Active Directory variables
$searchbase = "OU=Users,DC=fabrikam,DC=com"
$ADServer = 'DC01'
# Database variables
$sqlserver = "DB02"
$database = "My Database"
$table = "tblADimport"
$tableEmployee = "tblEmployees"
# Initialize
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
# GET DATA FROM ACTIVE DIRECTORY
# Import the ActiveDirectory Module
Import-Module ActiveDirectory
# Get all AD users not in specified OU's
Write-Host "Retrieving users from Active Directory..."
$AllADUsers = Get-ADUser -server $ADServer `
-searchbase $searchbase -Filter * -Properties * |
?{$_.DistinguishedName -notmatch 'OU=MeetingRooms,OU=Users,DC=fabrikam,DC=com' `
-and $_.DistinguishedName -notmatch 'OU=FunctionalMailbox,OU=Users,DC=fabrikam,DC=com'}
Write-Host "Users retrieved in $($elapsed.Elapsed.ToString())."
# Define labels and get specific user fields
Write-Host "Generating CSV file..."
$AllADUsers |
Select-Object @{Label = "UNID";Expression = {$_.objectGuid}},
@{Label = "FirstName";Expression = {$_.GivenName}},
@{Label = "LastName";Expression = {$_.sn}},
@{Label = "EmployeeNo";Expression = {$_.EmployeeID}} |
# Export CSV file and remove text qualifiers
Export-Csv -NoTypeInformation $csvfile -Encoding Unicode -Delimiter $csvdelimiter
Write-Host "Removing text qualifiers..."
(Get-Content $csvfile) | foreach {$_ -replace '"'} | Set-Content $csvfile
Write-Host "CSV file created in $($elapsed.Elapsed.ToString())."
# DATABASE IMPORT
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
$batchsize = 50000
# Delete all records in AD import table
Write-Host "Clearing records in AD import table..."
Invoke-Sqlcmd -Query "DELETE FROM $table" -Database $database -ServerInstance $sqlserver
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable and autogenerate the columns
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumns -eq $true) { $null = $reader.readLine()}
Write-Host "Importing to database..."
foreach ($column in $columns) { $null = $datatable.Columns.Add() }
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {
    $null = $datatable.Rows.Add($line.Split($csvdelimiter))
    $i++
    if (($i % $batchsize) -eq 0) {
        $bulkcopy.WriteToServer($datatable)
        Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
        $datatable.Clear()
    }
}
# Add in all the remaining rows since the last clear
if ($datatable.Rows.Count -gt 0) {
    $bulkcopy.WriteToServer($datatable)
    $datatable.Clear()
}
# Clean Up
Write-Host "CSV file imported in $($elapsed.Elapsed.ToString())."
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
# Update tblEmployee with imported data
Write-Host "Updating employee data..."
$queryUpdateUsers = "UPDATE $($tableEmployee)
SET $($tableEmployee).EmployeeNumber = $($table).EmployeeNo,
$($tableEmployee).FirstName = $($table).FirstName,
$($tableEmployee).LastName = $($table).LastName
FROM $($tableEmployee) INNER JOIN $($table) ON $($tableEmployee).UniqueNumber = $($table).UNID
IF @@ROWCOUNT=0
INSERT INTO $($tableEmployee) (EmployeeNumber, FirstName, LastName, UniqueNumber)
SELECT EmployeeNo, FirstName, LastName, UNID
FROM $($table)"
try {
    Invoke-Sqlcmd -ServerInstance $sqlserver -Database $database -Query $queryUpdateUsers
    Write-Host "Table $($tableEmployee) updated in $($elapsed.Elapsed.ToString())."
}
catch {
    Write-Host "An error occurred when updating $($tableEmployee) $($elapsed.Elapsed.ToString())."
}
Write-Host "Script completed in $($elapsed.Elapsed.ToString())."
I can see that Export-Csv exports into ANSI even though the encoding has been set to Unicode. Thanks for leading me in the right direction.
No, it exports as Unicode if set to.
Your export was wrong and is exporting nothing. Look closely at your code.
This line exports nothing in Unicode:
Export-Csv -NoTypeInformation $csvfile -Encoding Unicode -Delimiter $csvdelimiter
There is no input object.
This line converts any file to ANSI:
(Get-Content $csvfile) | foreach {$_ -replace '"'} | Set-Content $csvfile
Set-Content defaults to ANSI, so the output file is converted; pass -Encoding Unicode to Set-Content to keep the encoding.
Since you are just dumping into a table by manually building a recordset, why not go direct? You do not need a CSV. Just dump the results of the query to a DataTable.
https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
This script dumps to a DataTable object, which can then be used directly in a bulk copy.
Here is an example of how easy this is using your script:
$AllADUsers = Get-ADUser -server $ADServer -searchbase $searchbase -Filter * -Properties GivenName,SN,EmployeeID,objectGUID |
    Where-Object {
        $_.DistinguishedName -notmatch 'OU=MeetingRooms,OU=Users,DC=fabrikam,DC=com' -and
        $_.DistinguishedName -notmatch 'OU=FunctionalMailbox,OU=Users,DC=fabrikam,DC=com'
    } |
    Select-Object @{N='UNID';E={$_.objectGuid}},
        @{N='FirstName';E={$_.GivenName}},
        @{N='LastName';E={$_.sn}},
        @{N='EmployeeNo';E={$_.EmployeeID}} |
    Out-DataTable
$AllADUsers is now a DataTable. You can just upload it.
¯\_(ツ)_/¯ -
To show the result of 2 grids in a single csv file
Hi,
I want to show 2 grids (separately) in a CSV file. I created an aggregate query that gives me the result of the 2 grids separately, but when I use an iGrid template with the aggregate query it gives me the result in a single row, i.e. instead of giving 2 grids it gives the result with all the columns in a single row. I am using Applet.saveAsCSVFile(); is there any way to have 2 separate grids in a single Excel file?
Thanks in advance.
Karthik,
An aggregate query is mostly used to combine multiple queries which have at least one common column among them.
For your requirement, you can use direct BLS with the help of the action block WriteFile.
Scenario: you have 2 queries - one with 4 columns A, B, C & D; another with 2 columns E & F.
You want to display those tables separately in one single Excel sheet.
Solution: forget the aggregate query and take the following steps:
Step 1: Take those two Queries in one sequence
Step 2: Define two Local variables - Text1 & Text2 with String Data Type
Step 3: Use a Repeater to loop over the first query's result and append the looping results to Local.Text1 using the action Assignment, with the following expression in the Link Editor:
Local.Text1 &
Repeater_0.Output{/Row/A} & tab &
Repeater_0.Output{/Row/B} & tab &
Repeater_0.Output{/Row/C} & tab &
Repeater_0.Output{/Row/D} & crlf
Step 4: Use another Repeater outside the first one to loop over the second query's result and append the looping results to Local.Text2 in a similar way, as follows in the Link Editor:
Local.Text2 &
Repeater_1.Output{/Row/E} & tab &
Repeater_1.Output{/Row/F} & crlf
Step 5: Use the action WriteFile outside those Repeaters, with mode APPEND, FilePath C:\test\TwoTables.xls, and the following Text in the Link Editor:
"A" & tab & "B" & tab & "C" & tab & "D" & crlf & Local.Text1 & crlf & crlf &
"E" & tab & "F" & crlf & Local.Text2
Step 6: Hit F5/F6 and see the Excel file with the two tables showing separately at the path mentioned above.
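(For clarity, the same layout sketched outside MII in Python: the essential point is appending the second grid below the first, tab-separated, with a blank line between them. The values and the file path are examples.)

```python
# Two grids written one below the other into a single tab-delimited
# file, which Excel opens as one sheet.
grid1 = [["A", "B", "C", "D"], ["1", "2", "3", "4"]]
grid2 = [["E", "F"], ["5", "6"]]

with open("TwoTables.xls", "w", encoding="utf-8") as f:
    for grid in (grid1, grid2):
        for row in grid:
            f.write("\t".join(row) + "\n")
        f.write("\n")  # blank line separating the grids
```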
Regards
Som