Make fails while building CLDC 1.1 on Linux
Dear All,
Following is the software configuration:
javac: "1.4.2_08"
Linux version: Red Hat Linux 9
kernel: 2.4.20-8 on an i686
I am building CLDC 1.1 on Linux and am facing a strange problem. After running make, the build stops after these lines:
====>
../tools/preverifier/build/linux/preverify -d classes tmpclasses
make[1]: *** [compilefiles] Error 1
make[1]: Leaving directory `/users/in1222c/cldc1/api'
make: *** [all] Error 1
<====
All Java files (including the custom files I have added) get compiled to the corresponding class files, but just after the preverification message appears, the build fails with the error above.
Has anyone faced this problem? If so, what is going wrong here?
I would be glad for your comments and support on this.
Regards,
Hrishi.
Similar Messages
-
Getting a SqlBuildTask failure error while building the database project
Hi Experts,
I am getting the following error while building the database project; please help with this.
Error 803 04018: The "SqlBuildTask" task failed unexpectedly.
System.NullReferenceException: Object reference not set to an instance of an object.
at Microsoft.Data.Tools.Schema.UserInteractionServices.GetElementName(IModelElement element, ElementNameDetails details)
at Microsoft.Data.Tools.Schema.Sql.SqlUserInteractionServices.GetElementName(IModelElement element, ElementNameDetails details)
at Microsoft.Data.Tools.Schema.Sql.Sql110UserInteractionServices.GetElementName(IModelElement element, ElementNameDetails details)
at Microsoft.Data.Tools.Schema.Sql.Validation.MismatchedNameRule.Analyze(SqlSchemaModel sqlSchemaModel, ISqlModelElement sqlElement, SqlRuleSetting ruleSetting, SqlRuleExecutionContext context)
at Microsoft.Data.Tools.Schema.Sql.Validation.SqlValidationRuleProxy.Analyze(SqlSchemaModel sqlSchemaModel, ISqlModelElement sqlModelElement, SqlRuleSetting ruleSetting, SqlRuleExecutionContext context)
at Microsoft.Data.Tools.Schema.Sql.Validation.SqlValidationRule.Analyze(SqlRuleSetting ruleSetting, SqlRuleExecutionContext context)
at Microsoft.Data.Tools.Schema.Sql.RuleEngine.SqlRuleEngine.ExecuteRuleImpl(SqlRuleSetting ruleSetting, SqlRuleExecutionContext context, Predicate`1 suppressProblem, IEnumerable`1& errors)
at Microsoft.Data.Tools.Schema.Sql.RuleEngine.SqlRuleEngine.ExecuteRules(SqlSchemaModel dataSchemaModel, IEnumerable`1 ruleSettings, IEnumerable`1& errors, Predicate`1 suppressProblem, Func`1 executeCanceled)
at Microsoft.Data.Tools.Schema.Sql.RuleEngine.SqlRuleEngine.ExecuteRules(SqlSchemaModel dataSchemaModel, IEnumerable`1 rules, IEnumerable`1& errors, Predicate`1 suppressProblem, Func`1 executeCanceled)
at Microsoft.Data.Tools.Schema.Sql.Build.SqlTaskHost.RunBuildValidation(SqlSchemaModel model, ErrorManager errorContainer, Func`1 buildCanceledQuery, HashSet`1 includedRuleIDs, HashSet`1 excludedRuleIDs)
at Microsoft.Data.Tools.Schema.Sql.Build.SqlTaskHost.OnRunBuildValidations(ErrorManager errorManager, HashSet`1 includedRuleIds)
at Microsoft.Data.Tools.Schema.Sql.Build.SqlTaskHost.RunBuildValidations(ErrorManager errorManager, HashSet`1 includedRuleIDs)
at Microsoft.Data.Tools.Schema.Tasks.Sql.TaskHostLoader.LoadImpl(ITaskHost providedHost, TaskLoggingHelper providedLogger, Boolean runAllBuildTimeValidationRules)
at Microsoft.Data.Tools.Schema.Tasks.Sql.TaskHostLoader.Load(ITaskHost providedHost, TaskLoggingHelper providedLogger, Boolean runAllBuildTimeValidationRules)
at Microsoft.Data.Tools.Schema.Tasks.Sql.SqlBuildTask.ExecuteLoadTaskHostStep()
at Microsoft.Data.Tools.Schema.Tasks.Sql.SqlBuildTask.ExecuteStep(Func`1 step)
at Microsoft.Data.Tools.Schema.Tasks.Sql.SqlBuildTask.Execute()
at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute()
at Microsoft.Build.BackEnd.TaskBuilder.<ExecuteInstantiatedTask>d__20.MoveNext() C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\SSDT\Microsoft.Data.Tools.Schema.SqlTasks.targets 547 5 ABC_DB
Niraj Sevalkar
Hi Niraj, does this happen for all projects or for one specific project? I'd love to understand whether this is a specific bug or a setup issue.
If this issue is reproducible on the latest SSDT bits, I'd suggest capturing the event log for what is going wrong and then opening a Connect bug for this issue at https://connect.microsoft.com/SQLServer/feedback/CreateFeedback.aspx using the category "Developer Tools(SSDT, BIDS, etc.)". We're trying to track all bugs through Connect so that you can tell when we have fixed an issue and we can request more information. Please include the event log (instructions on this below) plus any other useful information you can provide, such as exact steps to reproduce the problem.
Gathering an Event Log for SSDT
For diagnostic purposes we would like you to gather an event log for the issue that you are experiencing in SSDT. In order to gather a log and send it to a member of
the team, please follow the steps below.
Open a new command prompt as Administrator.
Run the following command
logman create trace -n DacFxDebug -p "Microsoft-SQLServerDataTools" 0x800 -o "%LOCALAPPDATA%\DacFxDebug.etl" -ets
logman create trace -n SSDTDebug -p "Microsoft-SQLServerDataToolsVS" 0x800 -o "%LOCALAPPDATA%\SSDTDebug.etl" -ets
Run whatever the target/issue scenario is in SSDT.
Go back to the command prompt and run the following commands
logman stop DacFxDebug -ets
logman stop SSDTDebug -ets
The resulting ETL files will be located at %LOCALAPPDATA%\SSDTDebug.etl & %LOCALAPPDATA%\DacFxDebug.etl and can be navigated to using Windows Explorer.
Please attach these files when creating the Connect bug.
NOTE - These logs will only be used by Microsoft product team members in order to better diagnose the problem you are experiencing
with SSDT and will not be shared elsewhere. If you want to make sure that there is no private information in your ETL file, the SSDTDebug.etl file can be opened and analyzed using the Windows Event Viewer.
To do this, open the Windows Event Viewer application. In the right-hand panel, select Open Saved Log. Navigate to the location where you saved the log, open, and review
the contents of the trace.
Thanks,
Kevin -
Make failed while deploying loan flow
Hi,
when I try to deploy my loan flow sample through Jdeveloper to oracle bpel process manager 10.1.3.1.0
I am getting the following
"Beginning Deployment Process...
Compiling...
[6:54:51 PM] Compilation complete: 0 errors, 1 warnings.
Deploying to http://localhost:9700 domain: default. Please wait....[6:54:51 PM] Deployment failed.
Make failed, so nothing to deploy."
I would be glad if someone could help me out.
Thanks
-Sagar
Thanks for the reply. I restarted and tried again the next day without making any changes, and it deployed successfully. Probably a rare bug.
I am not using the SOA Suite; I am using Oracle BPEL Process Manager 10.1.3.1.0 alone. I think the default port here is 9700; I checked that http.port=9700.
One more thing: the BPEL server integration status shows OK with 9700, but the ESB status is failed. Am I right in assuming that there is no ESB service in Oracle BPEL Process Manager 10.1.3.1.0 alone? I think it is present only in the SOA Suite. -
CBS Make failed while importing ESS
Hi Experts,
I am importing the ESS package as per the JDI cookbook. MSS and the dependent packages imported fine, but the ESS package failed after 6 hours with:
Info:Starting Step CBS-make at 2009-02-12 03:08:11.0390 -8:00
Info:wait until CBS queue of buildspace JDV_XSSTrack_D is completely processed, before starting the import
Info:buildspace = JDV_XSSTrack_D build request id = 6797
Info:wait until CBS queue of buildspace JDV_XSSTrack_D is completely processed, before asking for build results
Info:waiting for CBS queue activity
Info:maximal waiting time is temporally increased
Info:build process already running: waiting for another period of 30000 ms
Info:build process already running: waiting for another period of 30000 ms
Info:no changes on the CBS request queue (JDV_XSSTrack_D) after a waiting time of 21630000 ms
Fatal:The request queue is not processed by the CBS during the given time intervall => the import failed because not every request including follow-up requests were processed
Fatal:Please look after the operational status of the CBS.
Fatal:communication error: The request queue is not processed during the given time intervall. Please look after the operational status of the CBS.
Info:Step CBS-make ended with result 'fatal error' ,stopping execution at 2009-02-12 09:25:54.0687 -8:00
There are no other errors; in fact, the import kept running for another 5 hours (judging from activity in the temp/CBS folder).
Any suggestions? How is the "waiting time 21630000 ms" controlled?
In the CBS buildspace information, there are 0 queued/processing/failed requests.
thanks,
DJ
Hi Daljit,
What is the database you are using?
Regards
Deb -
CBS Make failed while importing - Urgent
Hi...
I am following the JDI Cookbook to import the ESS and MSS SCA files.
I have the following issues.
1. I have placed all the SCA files (ESS, MSS, PCUI, JBUILT, sap-jee, and JTECH) in the CMS inbox, but when I go to check in, JBUILT and JTECH are not displayed in the dropdown.
I have successfully checked in the sap-jee, ESS, MSS, and PCUI SCA files, and they are listed in the transport queue.
2. I still went ahead with the import without the JTECH/JBUILT files. The repository import for sap-jee was successful, but not the CBS make.
3. This is the error log when I import the ESS/MSS SCA files;
the repository import itself fails:
Info:Starting Step Repository-import at 2005-10-27 18:45:06.0859 +5:00
Info:Component:sap.com/SAP-JEE
Info:Version :NIGHTLYBUILD.20051017041900
Info:4. PR is of type TCSSoftwareComponent
Info:Component:sap.com/SAP_ESS
Info:Version :MAIN_xss04PAT_C.20050908103103
Info:3. PR is of type TCSSoftwareComponent
Fatal Exception:com.sap.cms.tcs.interfaces.exceptions.TCSCommunicationException: communication error: PropagationException received: Could not perform Import Request: Status = 500 Internal Server Error:communication error: PropagationException received: Could not perform Import Request: Status = 500 Internal Server Error
com.sap.cms.tcs.interfaces.exceptions.TCSCommunicationException: communication error: PropagationException received: Could not perform Import Request: Status = 500 Internal Server Error
at com.sap.cms.tcs.client.DTRCommunicator.writeChangelistData(DTRCommunicator.java:228)
at com.sap.cms.tcs.core.RepositoryImportTask.processRepositoryImport(RepositoryImportTask.java:260)
at com.sap.cms.tcs.core.RepositoryImportTask.process(RepositoryImportTask.java:500)
at com.sap.cms.tcs.process.ProcessStep.processStep(ProcessStep.java:77)
at com.sap.cms.tcs.process.ProcessStarter.process(ProcessStarter.java:169)
at com.sap.cms.tcs.core.TCSManager.importPropagationRequests(TCSManager.java:276)
at com.sap.cms.pcs.transport.importazione.ImportManager.importazione(ImportManager.java:212)
at com.sap.cms.pcs.transport.importazione.ImportQueueHandler.execImport(ImportQueueHandler.java:432)
at com.sap.cms.pcs.transport.importazione.ImportQueueHandler.startImport(ImportQueueHandler.java:99)
at com.sap.cms.pcs.transport.proxy.CmsTransportProxyBean.startImport(CmsTransportProxyBean.java:392)
at com.sap.cms.pcs.transport.proxy.LocalCmsTransportProxyLocalObjectImpl0.startImport(LocalCmsTransportProxyLocalObjectImpl0.java:1156)
at com.sap.cms.ui.wl.Custom1.importQueue(Custom1.java:1114)
at com.sap.cms.ui.wl.wdp.InternalCustom1.importQueue(InternalCustom1.java:1025)
at com.sap.cms.ui.wl.Worklist.onActionImportQueue(Worklist.java:840)
Please help me with this, as I have had this problem for a long time. I have tried the import a couple of times.
Any hint will be rewarded!
Regards
ashutosh
I don't know if this will be much help; maybe you could point me to the JDI Cookbook you're using...
Based on the stack trace you've included, it seems that your client is trying to connect to the DTR, and an error is occurring in the operation that the DTR is trying to perform for you.
It's hard for me to assess the full situation without more details, but hopefully that narrows it down a little bit for you.
-Steve
If you find a post useful, please help keep the community going by setting a good example and rewarding the poster with points. -
Step CBS make failed while importing
Hi all,
I am getting the following errors when I try to import the SC SAPPCUI_GP or any other XSS SC.
SAP-JEE imports successfully, but the SCA files for ESS, MSS, and PCUI are not imported.
The error is:
Info:wait until CBS queue of buildspace EPD_ESSTrack_D is completely processed, before starting the import
Info:waiting for CBS queue activity
Info:build process already running: waiting for another period of 30000 ms (1)
Info:build process already running: waiting for another period of 30000 ms (2)
Info:build process already running: waiting for another period of 30000 ms (24)
Error:CBS error on retrieving open build requests
caused by:Connection refused: connect (Service call exception; nested exception is:
java.net.ConnectException: Connection refused: connect)
com.sap.tc.cbs.client.error.CommunicationException: Connection refused: connect (Service call exception; nested exception is:
java.net.ConnectException: Connection refused: connect)
at com.sap.tc.cbs.client.impl.BuildSpace.listOpenRequests(BuildSpace.java:530)
at com.sap.cms.tcs.client.CBSCommunicator.getOpenBuildRequests(CBSCommunicator.java:1097)
at com.sap.cms.tcs.client.CBSCommunicator.isThereQueueActivity(CBSCommunicator.java:1477)
at com.sap.cms.tcs.client.CBSCommunicator.importRequest(CBSCommunicator.java:351)
at com.sap.cms.tcs.core.CbsMakeTask.processMake(CbsMakeTask.java:111)
at com.sap.cms.tcs.core.CbsMakeTask.process(CbsMakeTask.java:300)
I am getting this connection-refused error everywhere now.
I am not able to make out which connection this is. Is it the connection to the DB?
Please help.
Regards
ashutosh
Hi Ashutosh,
You may need to restart the CBS service using the Visual Admin.
Regards
Sidharth -
I updated the AIR SDK from 4.0 to 16.0 beta to fix an Apple App Store submission bug. I can build the project and run it on the simulator without any problem. But when I tried to export the iOS release build, I got this error message: "Compilation failed while executing : compile-abc".
I used the command line to execute compile-abc.exe with the parameters that ADT gives it, but it failed without any error message.
Here is the command line:
compile-abc.exe -mtriple=armv7-apple-ios -filetype=obj -sdk "C:\Program Files (x86)\Adobe\Adobe Flash Builder 4.5\sdks\4.5.1 - air16\lib\aot/lib/avmglue.abc" -fields "C:\Program Files (x86)\Adobe\Adobe Flash Builder 4.5\sdks\4.5.1 - air16\lib\aot/lib/air-fields.arm-air.txt" -O3 -abc-file-list=E:\MyApp\bin-debug\AOTBuildOutput8184169967790207636.tmp\ABCFilesList.txt
There's an empty file AOTBuildOutput-0000001821_1821.o left in the command-line working path. I opened the file AOTBuildOutput-0000001821.abc but didn't get any clue. How can I trace this problem?
Still having similar issues; it seems to happen when I embed an image, but it works for every build type except the release build.
I tried your workaround, but it doesn't seem to make a difference.
[Embed(source = "/../assets/[email protected]")]
protected static const ATLAS_IMAGE:Class;
[Embed(source = "/../assets/[email protected]", mimeType = "application/octet-stream")]
protected static const ATLAS_XML:Class;
[Embed(source = "/../assets/iconMap.png")]
protected static const ATLAS_IMAGE_LOW_RES:Class;
[Embed(source = "/../assets/iconMap.xml", mimeType = "application/octet-stream")]
protected static const ATLAS_XML_LOW_RES:Class;
if(Starling.current.viewPort.width > 320){
atlas = new TextureAtlas(Texture.fromBitmap(new ATLAS_IMAGE(), false), XML(new ATLAS_XML()));
}else{
atlas = new TextureAtlas(Texture.fromBitmap(new ATLAS_IMAGE_LOW_RES(), false), XML(new ATLAS_XML_LOW_RES()));
} -
[SOLVED] make fails but makepkg works fine (when building zathura git)
Hello,
When I build the "zathura-girara-git" AUR package with makepkg, everything works fine, but if I try to build zathura after cloning the git repo, make fails with some errors.
make output from makepkg:
zathura build options:
CFLAGS = -march=native -O2 -pipe -fstack-protector --param=ssp-buffer-size=4 -D_FORTIFY_SOURCE=2 -std=c99 -pedantic -Wall -Wno-format-zero-length -Wextra -pthread -I/usr/include/gtk-2.0 -I/usr/lib/gtk-2.0/include -I/usr/include/atk-1.0 -I/usr/include/cairo -I/usr/include/gdk-pixbuf-2.0 -I/usr/include/pango-1.0 -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -I/usr/include/pixman-1 -I/usr/include/freetype2 -I/usr/include/libpng14
LIBS = -lgirara-gtk2 -pthread -lgtk-x11-2.0 -lgdk-x11-2.0 -latk-1.0 -lgio-2.0 -lpangoft2-1.0 -lpangocairo-1.0 -lgdk_pixbuf-2.0 -lcairo -lpango-1.0 -lfreetype -lfontconfig -lgobject-2.0 -lgmodule-2.0 -lgthread-2.0 -lrt -lglib-2.0 -ldl -lpthread -lm
DFLAGS = -g
CC commands.c
CC = cc
CC document.c
CC render.c
CC zathura.c
CC completion.c
CC bookmarks.c
CC utils.c
CC shortcuts.c
CC config.c
CC callbacks.c
CC print.c
CC database-plain.c
CC -o zathura
installing executable file
installing header files
installing manual pages
which: no rst2man in (/usr/lib/ccache/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/bin/vendor_perl:/usr/bin/core_perl:/usr/local/bin/:/home/bexie/bin)
installing desktop file
installing pkgconfig file
make output:
zathura build options:
CFLAGS = -std=c99 -pedantic -Wall -Wno-format-zero-length -Wextra -pthread -I/usr/include/gtk-2.0 -I/usr/lib/gtk-2.0/include -I/usr/include/atk-1.0 -I/usr/include/cairo -I/usr/include/gdk-pixbuf-2.0 -I/usr/include/pango-1.0 -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -I/usr/include/pixman-1 -I/usr/include/freetype2 -I/usr/include/libpng14
LIBS = -lgirara-gtk2 -pthread -lgtk-x11-2.0 -lgdk-x11-2.0 -latk-1.0 -lgio-2.0 -lpangoft2-1.0 -lpangocairo-1.0 -lgdk_pixbuf-2.0 -lcairo -lpango-1.0 -lfreetype -lfontconfig -lgobject-2.0 -lgmodule-2.0 -lgthread-2.0 -lrt -lglib-2.0 -lsqlite3 -ldl -lpthread -lm
DFLAGS = -g
CC = cc
CC commands.c
In file included from commands.c:3:0:
commands.h:7:20: fatal error: girara.h: No such file or directory
compilation terminated.
make: *** [commands.o] Error 1
I tried exporting the environment variables from /etc/makepkg.conf, to no avail. I also tried adding the girara include directories to CFLAGS, but then it throws the same kind of error about some GTK dependency.
I don't know if this is a general issue, but as I can compile it through makepkg I don't think this is due to zathura.
Does anyone have a clue on what could be wrong here?
Last edited by donbex (2012-01-22 12:56:59)
In the PKGBUILD, they clone the repo by doing:
_gitroot="git://pwmt.org/zathura.git"
_gitname="zathura"
git clone $_gitroot
cd $_gitname && git checkout --track -b develop origin/develop
Are you doing the same? -
When using DDX to merge certain pdf files, we get the following error:
failed: DDXM_S18005: An error occurred in the PrepareTOC phase while building <TableOfContents>. Cause given.
We are running Coldfusion 9.0.1 on a Windows Server 2008 R2 64-bit server.
The error occurs only on certain pdf files, and only if we include them in the table of contents.
There was a similar bug in ColdFusion 8 that was fixed, but it looks like it has come back in ColdFusion 9.
merge.cfm:
<CFLOOP LIST="merge_no_toc.ddx,merge_toc.ddx" INDEX="ddx">
<CFLOOP LIST="good.pdf,bad.pdf" INDEX="pdf">
<CFSET inputStruct = StructNew()>
<CFSET StructInsert(inputStruct, "In1", pdf)>
<CFSET StructInsert(inputStruct, "In2", pdf)>
<CFSET outputStruct = StructNew()>
<CFSET outputStruct.Out1 = "merged.pdf">
<CFPDF ACTION="processddx" DDXFILE="#ddx#" INPUTFILES="#inputStruct#" OUTPUTFILES="#outputStruct#" NAME="merge"></CFPDF>
<CFOUTPUT>#DDX# with #pdf#:</CFOUTPUT>
<CFDUMP VAR="#merge#">
<BR />
</CFLOOP>
</CFLOOP>
merge_toc.ddx
<?xml version="1.0" encoding="UTF-8" ?>
<DDX xmlns="http://ns.adobe.com/DDX/1.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://ns.adobe.com/DDX/1.0/ coldfusion_ddx.xsd">
<PDF result="Out1">
<TableOfContents bookmarkTitle="Table Of Contents" maxBookmarkLevel="1" includeInTOC="false">
<Header>
<Center>
<StyledText>
<p color="black" font-size="16pt">Table of Contents</p>
<p />
<p />
</StyledText>
</Center>
</Header>
</TableOfContents>
<PDF source="In1" bookmarkTitle="1" includeInTOC="true">
<PageLabel mode="Define" start="_PageNumber"/>
</PDF>
<PDF source="In2" bookmarkTitle="2" includeInTOC="true">
<PageLabel mode="Define" start="_PageNumber"/>
</PDF>
</PDF>
</DDX>
merge_no_toc.ddx
<?xml version="1.0" encoding="UTF-8" ?>
<DDX xmlns="http://ns.adobe.com/DDX/1.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://ns.adobe.com/DDX/1.0/ coldfusion_ddx.xsd">
<PDF result="Out1">
<TableOfContents bookmarkTitle="Table Of Contents" maxBookmarkLevel="1" includeInTOC="false">
<Header>
<Center>
<StyledText>
<p color="black" font-size="16pt">Table of Contents</p>
<p />
<p />
</StyledText>
</Center>
</Header>
</TableOfContents>
<PDF source="In1" bookmarkTitle="1" includeInTOC="false">
<PageLabel mode="Define" start="_PageNumber"/>
</PDF>
<PDF source="In2" bookmarkTitle="2" includeInTOC="false">
<PageLabel mode="Define" start="_PageNumber"/>
</PDF>
</PDF>
</DDX>
The resulting output:
The "bad" pdf is a perfectly normal PDF with the Helvetica font (this is what the copier does).
The "good" pdf is a perfectly normal PDF from a different copier, without the Helvetica font.
The fonts available, as listed by the ColdFusion Administration page, include ArialMT and Helvetica:
The good and bad pdf files, as well as the ddx and cfm files, can be downloaded here if you want to run it yourself: http://dl.dropbox.com/u/55552656/merge.zip
https://bugbase.adobe.com/index.cfm?event=bug&id=3192782
I learned about the 8.0.1 part from here:
http://www.designovermatter.com/post.cfm/failed-ddxm-s18005-an-error-occurred-in-the-preparetoc-phase-while-building-tableofcontents -
CLDC 1.1 make fails with invalid storage class
I am unable to build CLDC 1.1.
I went into the j2me_cldc/build/linux directory and did this:
make clean
make
I get this error:
make[3]: Leaving directory `/opt/j2me_cldc/tools/jcc'
make[2]: Leaving directory `/opt/j2me_cldc/tools/jcc'
... obj/nativeFunctionTableUnix.o
... obj/events.o
../../../kvm/VmCommon/src/events.c: In function 'InitializeEvents':
../../../kvm/VmCommon/src/events.c:65: warning: dereferencing type-punned pointer will break strict-aliasing rules
../../../kvm/VmCommon/src/events.c: In function 'getKVMEvent':
../../../kvm/VmCommon/src/events.c:141: warning: unused variable 'currentTime'
... obj/resource.o
../../../kvm/VmExtra/src/resource.c: In function 'Java_com_sun_cldc_io_ResourceInputStream_open':
../../../kvm/VmExtra/src/resource.c:52: warning: dereferencing type-punned pointer will break strict-aliasing rules
../../../kvm/VmExtra/src/resource.c:54: warning: dereferencing type-punned pointer will break strict-aliasing rules
../../../kvm/VmExtra/src/resource.c: In function 'Java_com_sun_cldc_io_ResourceInputStream_close':
../../../kvm/VmExtra/src/resource.c:84: warning: dereferencing type-punned pointer will break strict-aliasing rules
../../../kvm/VmExtra/src/resource.c: In function 'Java_com_sun_cldc_io_ResourceInputStream_read':
../../../kvm/VmExtra/src/resource.c:107: warning: dereferencing type-punned pointer will break strict-aliasing rules
... obj/verifierUtil.o
../../../kvm/VmCommon/src/verifierUtil.c: In function 'matchStackMap':
../../../kvm/VmCommon/src/verifierUtil.c:426: warning: dereferencing type-punned pointer will break strict-aliasing rules
../../../kvm/VmCommon/src/verifierUtil.c: In function 'verifyClass':
*../../../kvm/VmCommon/src/verifierUtil.c:547: error: invalid storage class for function 'Vfy_verifyMethod'*
../../../kvm/VmCommon/src/verifierUtil.c:571: warning: implicit declaration of function 'Vfy_verifyMethod'
../../../kvm/VmCommon/src/verifierUtil.c:604: warning: dereferencing type-punned pointer will break strict-aliasing rules
../../../kvm/VmCommon/src/verifierUtil.c: At top level:
*../../../kvm/VmCommon/src/verifierUtil.c:1595: error: static declaration of 'Vfy_verifyMethod' follows non-static declaration*
*../../../kvm/VmCommon/src/verifierUtil.c:571: error: previous implicit declaration of 'Vfy_verifyMethod' was here*
And there are more similar errors, for example for Vfy_checkNewInstructions.
../../../kvm/VmCommon/src/verifierUtil.c:1633: error: previous implicit declaration of 'Vfy_checkNewInstructions' was here
make[1]: *** [obj/verifierUtil.o] Error 1
make[1]: Leaving directory `/opt/j2me_cldc/kvm/VmUnix/build'
make: *** [all] Error 1
Line 547 is: static int Vfy_verifyMethod(METHOD vMethod);
Please help me to build this.
-
Please help me to understand the Caution section
Caution:
If you do not open with the RESETLOGS option,
then two copies of an archived redo log for a given log sequence number may
exist--even though these two copies have completely different contents.
For example, one log may have been created on the original host and the other on the new host.
If you accidentally confuse the logs during a media recovery,
then the database will be corrupted but Oracle and RMAN cannot detect the problem.As per my understanding it says. If you don't open database with RESETLOGS option then there may be archived logs with log sequence number which is already archived on the source host. This may happen due to difference in RECIDs. Now when the database needs media recovery for this particular log sequence, you may provide any of them. So in this case, RMAN and Oracle will not be able to differentiate the two files and can accept any of the archived log files during recovery. Since the contents of two archived logs are different, because they are generated at different times and they contains different transactions. So, internally it corrupts your database.
Rgds. -
Release build error with ANE "AOT Compilation has failed while optimizing function"
I'm trying to use the ChartBoost ANE which can be downloaded here https://github.com/freshplanet/ANE-Chartboost
When I do an ad-hoc release build, it keeps crashing and giving me the error below when packaging for iOS. But if I do a debug build that's a "fast" or "standard" build, it compiles fine, and I can even see the ads.
Why would it crash when doing a release build? Does this ANE need an update?
I'm using FB 4.7 and AIR 3.8. Specifically, Apache Flex 4.10 Air 3.8
If I remove the one line of Chartboost ANE code, "AirChartboost.getInstance().showInterstitial();", the release build compiles fine and I can put it on the test iOS device.
"Error occurred while packaging the application:
AOT Compilation has failed while optimizing function public::AdsHelper.showFullscreenAd
Exception in thread "main" java.lang.NullPointerException
at java.util.TreeMap.getEntry(TreeMap.java:324)
at java.util.TreeMap.containsKey(TreeMap.java:209)
at adobe.abc.GlobalOptimizer.sccp_eval(GlobalOptimizer.java:6150)
at adobe.abc.GlobalOptimizer.sccp_analyze(GlobalOptimizer.java:6019)
at adobe.abc.GlobalOptimizer.sccp(GlobalOptimizer.java:4733)
at adobe.abc.GlobalOptimizer.optimize(GlobalOptimizer.java:3615)
at adobe.abc.GlobalOptimizer.optimize(GlobalOptimizer.java:2309)
at adobe.abc.LLVMEmitter.optimizeABCs(LLVMEmitter.java:534)
at adobe.abc.LLVMEmitter.generateBitcode(LLVMEmitter.java:343)
at com.adobe.air.ipa.AOTCompiler.convertAbcToLlvmBitcodeImpl(AOTCompiler.java:611)
at com.adobe.air.ipa.BitcodeGenerator.main(BitcodeGenerator.java:104)
Compilation failed while executing : ADT"
I just encountered this error when building an ad-hoc package, saying:
Error occurred while packaging the application:
AOT Compilation has failed while optimizing function apackage::AClass.aFunction
After several tries, it turned out that the error occurred because the function aFunction had a return statement before some other code (added for testing). I commented out the code after the return, and then the packaging worked fine.
FYI. -
Error while building executable
Hi,
while building an executable I get the following error:
Error 6 occurred at Librarian Path Location.vi Can't list parent path
Possible reason(s):
LabVIEW: Generic file I/O error.
NI-488: I/O operation aborted.
If I try to build the executable of the initial version of the same VI, staying in the same file system on the same machine, everything builds fine.
Thanks for your help
I feel your pain. Compiling and building installations in LabVIEW is NOT intuitive. My guess is you have failed to install a necessary LLB or driver in your compiling or installation process. You will need to make sure the other computers the program runs on have the LabVIEW runtime installed, and you may need to select other options to install during the installation build, listed under the "additional installers" tab, such as NI-488 or NI-VISA, in order to enable communication with external instruments. It works on the development computer because those things are already installed. You might search for "Error 6" here in the LabVIEW forum; as I recall, it's a fairly common problem.
-
IP Resource Errors While Building a Windows Failover Cluster on Azure
I have setup a failover cluster in Microsoft Azure on four VMs. I have two nodes in one subnet/region, one node in another subnet/region and a final node in a third subnet/region. The cluster passed the validation wizard and was successfully built, but the
two nodes in other subnets will not go online, reporting "Failed to bring the resource 'IP Address x.x.x.x' online".
Azure Vnet Setup
Vnet 1 - South Central US - 10.16.16.0/16 - Nodes 1 & 2 - Online
Vnet 2 - West US - 10.116.16.0/16 - Node 3 - Offline
Vnet 3 - East US - 10.216.16.0/16 - Node 4 - Offline
All three IP Address resources are DHCP. I've attempted to make them static but they still failed while coming online. If I click "Information Details" on the error I get:
Error code: 0x80071397
The operation failed because either the specified cluster node is not the owner of the resource, or the node is not a possible owner of the resource.
It seems to be an issue with the way Azure DHCP works, but I'm stalled. There must be a way to get this working. What am I missing?
Hi Michael Lapidakis,
Starting in Windows Server 2008, a feature called an "OR dependency" was added to Failover Clustering, which gave admins the ability to have a Network Name that could use one IP address OR another. In a multisite cluster, it is expected that one IP address is offline, as it is an OR relationship.
You can refer to the following related article for more detail on multi-site cluster IP settings.
IP Resource Errors While Building a Windows Failover Cluster on Azure
https://social.technet.microsoft.com/Forums/en-US/ca2cbf5e-abef-4c23-9cff-7b6ca44acc23/ip-resource-errors-while-building-a-windows-failover-cluster-on-azure?forum=winserverClustering
I'm glad to be of help to you!
We are trying to better understand customer views on the social support experience, so your participation in this interview project would be greatly appreciated if you have time.
Thanks for helping make community forums a great place. -
AIR 3.8, iOS Packaging Failed: "Compilation failed while executing : ld64"
Since upgrading to AIR 3.8, we can no longer package our AIR Application.
We are getting the following error:
internal_package-ios:
[exec] ld: -pie can only be used when targeting iOS 4.2 or later
[exec] Compilation failed while executing : ld64
Screenshot Here: http://cl.ly/image/3h2U1g271n1J
Reverting to AIR 3.7 fixes the issue... but we need 3.8, as we need the 4096 texture support that comes with the BASELINE_EXTENDED profile.
From googling, this seems related to the iOS SDK included with AIR, but the issue was fixed in AIR 3.7. It seems to have regressed.

Hi Shawn,
Since AIR 3.8, we support only PIE-enabled binaries, as per Apple's recommendation.
For this, the minimum supported iOS version is 4.2. The ANE you are using specifies a minimum iOS version of 4.0 in its platform-iphone.xml, present at: https://github.com/alebianco/ANE-Google-Analytics/blob/master/build/platform-iphone.xml
You will need to change the line:
<option>-ios_version_min 4.0</option>
to:
<option>-ios_version_min 4.2</option>
and rebuild the ANE to make it work.
Regards,
Neha -
[SOLVED] dkms thinks "make" failed even though it exits successfully
Greetings. I have a laptop with my own kernel module that I update every now and then.
So far I have always been able to compile it just fine.
I then created a PKGBUILD and a dkms config for it. That worked flawlessly as well, and it has been working for a while.
But recently, with the latest updates, dkms started to spit out errors about my dkms package, so I went in to check what was up.
Some strange things started happening: if I ask dkms to update my module, it ends with an error:
Command:
sudo dkms install asus_oled-sg/r103 -k $(uname -r)
Output:
Kernel preparation unnecessary for this kernel. Skipping...
Building module:
cleaning build area....
make KERNELRELEASE=3.17.4-1-ARCH -C /var/lib/dkms/asus_oled-sg/r103/build/....
Error! Build of asus_oled-sg.ko failed for: 3.17.4-1-ARCH (x86_64)
Consult the make.log in the build directory
/var/lib/dkms/asus_oled-sg/r103/build/ for more information.
The content of /var/lib/dkms/asus_oled-sg/r103/build/make.log :
DKMS make.log for asus_oled-sg-r103 for kernel 3.17.4-1-ARCH (x86_64)
Sun 14 Dec 15:22:45 CET 2014
make: Entering directory '/var/lib/dkms/asus_oled-sg/r103/build'
make[1]: Entering directory '/usr/lib/modules/3.17.4-1-ARCH/build'
CC [M] /var/lib/dkms/asus_oled-sg/r103/build/asus_oled.o
In file included from include/linux/thread_info.h:11:0,
from ./arch/x86/include/asm/preempt.h:6,
from include/linux/preempt.h:18,
from include/linux/spinlock.h:50,
from include/linux/seqlock.h:35,
from include/linux/time.h:5,
from include/linux/stat.h:18,
from include/linux/module.h:10,
from /var/lib/dkms/asus_oled-sg/r103/build/asus_oled.c:39:
include/linux/bug.h:33:45: warning: initialization from incompatible pointer type
#define BUILD_BUG_ON_ZERO(e) (sizeof(struct { int:-!!(e); }))
^
include/linux/kernel.h:849:3: note: in expansion of macro ‘BUILD_BUG_ON_ZERO’
BUILD_BUG_ON_ZERO((perms) & 2) + \
^
include/linux/sysfs.h:75:12: note: in expansion of macro ‘VERIFY_OCTAL_PERMISSIONS’
.mode = VERIFY_OCTAL_PERMISSIONS(_mode) }, \
^
include/linux/device.h:426:46: note: in expansion of macro ‘__ATTR’
struct class_attribute class_attr_##_name = __ATTR(_name, _mode, _show, _store)
^
/var/lib/dkms/asus_oled-sg/r103/build/asus_oled.c:765:8: note: in expansion of macro ‘CLASS_ATTR’
static CLASS_ATTR(version, S_IRUGO, version_show, NULL);
^
include/linux/bug.h:33:45: warning: (near initialization for ‘class_attr_version.show’)
#define BUILD_BUG_ON_ZERO(e) (sizeof(struct { int:-!!(e); }))
^
include/linux/kernel.h:849:3: note: in expansion of macro ‘BUILD_BUG_ON_ZERO’
BUILD_BUG_ON_ZERO((perms) & 2) + \
^
include/linux/sysfs.h:75:12: note: in expansion of macro ‘VERIFY_OCTAL_PERMISSIONS’
.mode = VERIFY_OCTAL_PERMISSIONS(_mode) }, \
^
include/linux/device.h:426:46: note: in expansion of macro ‘__ATTR’
struct class_attribute class_attr_##_name = __ATTR(_name, _mode, _show, _store)
^
/var/lib/dkms/asus_oled-sg/r103/build/asus_oled.c:765:8: note: in expansion of macro ‘CLASS_ATTR’
static CLASS_ATTR(version, S_IRUGO, version_show, NULL);
^
In file included from include/linux/printk.h:5:0,
from include/linux/kernel.h:13,
from /var/lib/dkms/asus_oled-sg/r103/build/asus_oled.c:38:
/var/lib/dkms/asus_oled-sg/r103/build/asus_oled.c: In function ‘__inittest’:
include/linux/init.h:329:4: warning: return from incompatible pointer type
{ return initfn; } \
^
/var/lib/dkms/asus_oled-sg/r103/build/asus_oled.c:805:1: note: in expansion of macro ‘module_init’
module_init(asus_oled_init);
^
Building modules, stage 2.
MODPOST 1 modules
CC /var/lib/dkms/asus_oled-sg/r103/build/asus_oled.mod.o
LD [M] /var/lib/dkms/asus_oled-sg/r103/build/asus_oled.ko
make[1]: Leaving directory '/usr/lib/modules/3.17.4-1-ARCH/build'
make: Leaving directory '/var/lib/dkms/asus_oled-sg/r103/build'
So it contains a few notes and warnings, but it compiles just fine. It has generated a .ko file.
Yet dkms thinks "make" failed. Why?
If I try this, it works:
cd /var/lib/dkms/asus_oled-sg/r103/build/
sudo make clean
sudo make
No errors, the same set of notes and warnings, and exit status 0, as it should be.
Then why does dkms think it has failed?
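One plausible reason dkms reports failure even when make exits 0: after running MAKE[0], dkms also checks that the built module actually exists where it expects it (the build directory, named after BUILT_MODULE_NAME). A minimal sketch of that kind of check, with hypothetical paths:

```shell
# Hypothetical illustration: dkms requires more than exit status 0 from
# MAKE[0] -- the built .ko must also turn up where dkms expects it.
build_dir=$(mktemp -d)
module_name="asus_oled"          # BUILT_MODULE_NAME[0] from dkms.conf

# Simulate a make that exits 0 but leaves the module in another directory
# (e.g. because MAKE[0] hardcoded a path, as the original dkms.conf did):
mkdir -p "$build_dir/elsewhere"
touch "$build_dir/elsewhere/$module_name.ko"

# dkms-style check on the directory it actually scans:
if [ -f "$build_dir/$module_name.ko" ]; then
    result="build OK"
else
    result="Error! Build of $module_name.ko failed"
fi
echo "$result"
rm -rf "$build_dir"
```

So a clean exit status is not enough on its own; the module has to land in the directory dkms scans.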
This is my dkms.conf file:
PACKAGE_NAME="@_PKGBASE@"
PACKAGE_VERSION="@PKGVER@"
MAKE[0]='make -C /var/lib/dkms/@_PKGBASE@/@PKGVER@/build/'
CLEAN='make -C /var/lib/dkms/@_PKGBASE@/@PKGVER@/build/ clean'
BUILT_MODULE_NAME[0]="@_PKGBASE@"
DEST_MODULE_LOCATION[0]="/extra"
AUTOINSTALL="yes"
Last edited by SysGhost (2014-12-15 10:05:47)

Well, I finally solved it.
I had to make some adjustments in the given examples at https://wiki.archlinux.org/index.php/Dy … le_Support
Looks like I tried to overdo it. Old versions of the PKGBUILD and dkms.conf that previously worked led me astray.
This is what the files look like now (slightly different from what the wiki page above gives; I had to make some adjustments):
PKGBUILD:
_pkgbase=asus_oled
pkgname=asus_oled-dkms
pkgver=r103
pkgrel=6
pkgdesc="Driver for small OLED displays found in some older Asus laptops. Ugly-patched for >3.14 kernels."
url="http://lapsus.berlios.de/asus_oled.html"
arch=('x86_64' 'i686')
license=('GPL2')
depends=('dkms')
makedepends=('sed')
provides=("${_pkgbase}")
conflicts=("${_pkgbase}")
install="${pkgname}.install"
source=("${pkgname}.tar.gz"
"dkms.conf")
md5sums=('SKIP'
'SKIP')
build() {
cd "${_pkgbase}-${pkgver}"
make clean
make
rm modules.order
}

package() {
# Prepare the Makefile
sed -i 's/depmod/\#depmod/g' "${_pkgbase}-${pkgver}"/Makefile
# Install
msg2 "Starting make install..."
make -C "${_pkgbase}-${pkgver}" PREFIX="${pkgdir}/usr" DESTDIR="${pkgdir}/usr" INSTALL_MOD_PATH="/usr/lib/modules/$(uname -r)/extra" install
# Copy dkms.conf
install -Dm644 dkms.conf "${pkgdir}"/usr/src/${_pkgbase}-${pkgver}/dkms.conf
# Set name and version
sed -e "s/@_PKGBASE@/${_pkgbase}/" \
-e "s/@PKGVER@/${pkgver}/" \
-i "${pkgdir}"/usr/src/${_pkgbase}-${pkgver}/dkms.conf
# Copy sources (including Makefile)
cp -r "${_pkgbase}-${pkgver}"/* "${pkgdir}"/usr/src/${_pkgbase}-${pkgver}/
}
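The placeholder substitution that package() performs on dkms.conf can be exercised on its own; a minimal sketch using the values from this package:

```shell
# Run the same sed substitution package() uses, against a throwaway copy
# of the two placeholder lines from dkms.conf:
tmpf=$(mktemp)
printf 'PACKAGE_NAME="@_PKGBASE@"\nPACKAGE_VERSION="@PKGVER@"\n' > "$tmpf"
subst=$(sed -e 's/@_PKGBASE@/asus_oled/' -e 's/@PKGVER@/r103/' "$tmpf")
echo "$subst"
rm -f "$tmpf"
```

This prints the dkms.conf lines with the real package name and version filled in.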
dkms.conf:
PACKAGE_NAME="@_PKGBASE@"
PACKAGE_VERSION="@PKGVER@"
MAKE[0]='make'
CLEAN='make clean'
BUILT_MODULE_NAME[0]="@_PKGBASE@"
DEST_MODULE_LOCATION[0]="/extra"
AUTOINSTALL="yes"
asus_oled-dkms.install:
post_install() {
dkms install asus_oled/${1%%-*}
pre_upgrade() {
pre_remove ${2%%-*}
post_upgrade() {
post_install ${1%%-*}
pre_remove() {
dkms remove asus_oled/${1%%-*} --all
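The `${1%%-*}` expansion used throughout these hooks can be checked in isolation; a small sketch where the version string is just an example:

```shell
# pacman hands install hooks a full version like "r103-6" (pkgver-pkgrel);
# dkms wants only the pkgver, so ${var%%-*} strips from the first '-' on.
ver="r103-6"
dkms_ver="${ver%%-*}"
echo "$dkms_ver"   # r103
```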
Seems this was more of a packaging problem than a problem with the kernel module itself, as I first thought. Hence the thread placement.
I am now marking this as solved.