Task #2763 (closed)
Opened 14 years ago
Closed 14 years ago
BUG: Cannot retrieve the free space (Feedback 2678)
| Reported by: | omero-qa | Owned by: | cxallan |
|---|---|---|---|
| Priority: | minor | Milestone: | OMERO-Beta4.2.1 |
| Component: | from QA | Version: | 4.1 |
| Keywords: | n.a. | Cc: | p.schregle@…, jburel |
| Resources: | n.a. | Referenced By: | n.a. |
| References: | n.a. | Remaining Time: | n.a. |
| Sprint: | n.a. | | |
Description (last modified by atarkowska)
http://qa.openmicroscopy.org.uk/qa/feedback/2678/
Comment: Opening Disk Space when no data was there
This problem is related to http://loci.wisc.edu/trac/java/ticket/543
Change History (6)
comment:1 Changed 14 years ago by atarkowska
- Cc jburel added
- Description modified (diff)
- Milestone changed from Unscheduled to OMERO-Beta4.2.1
comment:2 Changed 14 years ago by atarkowska
```
Traceback (most recent call last):
  File "/home/omero/omero_dist/lib/python/django/core/handlers/base.py", line 92, in get_response
    response = callback(request, *callback_args, **callback_kwargs)
  File "/home/omero/omero_dist/lib/python/omeroweb/webadmin/views.py", line 204, in wrapped
    return f(request, *args, **kwargs)
  File "/home/omero/omero_dist/lib/python/omeroweb/webadmin/views.py", line 990, in drivespace
    controller = BaseDriveSpace(conn)
  File "/home/omero/omero_dist/lib/python/omeroweb/webadmin/controller/drivespace.py", line 35, in __init__
    self.freeSpace = self.conn.getFreeSpace()
  File "/home/omero/omero_dist/lib/python/omeroweb/extlib/gateway.py", line 262, in getFreeSpace
    return rep_serv.getFreeSpaceInKilobytes() * 1024
  File "/home/omero/omero_dist/lib/python/omero/gateway/__init__.py", line 1841, in wrapped
    return inner(*args, **kwargs)
  File "/home/omero/omero_dist/lib/python/omero/gateway/__init__.py", line 1774, in inner
    return f(*args, **kwargs)
  File "/home/omero/omero_dist/lib/python/omero_api_IRepositoryInfo_ice.py", line 89, in getFreeSpaceInKilobytes
    return _M_omero.api.IRepositoryInfo._op_getFreeSpaceInKilobytes.invoke(self, ((), _ctx))
ResourceError: exception ::omero::ResourceError
{
    serverStackTrace = ome.conditions.ResourceError: Cannot run program "df": java.io.IOException: error=24, Too many open files
        at ome.logic.RepositoryInfoImpl.getFreeSpaceInKilobytes(RepositoryInfoImpl.java:164)
        at sun.reflect.GeneratedMethodAccessor374.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
        at ome.security.basic.EventHandler.invoke(EventHandler.java:144)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:111)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:108)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:222)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:111)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
        at $Proxy66.getFreeSpaceInKilobytes(Unknown Source)
        at sun.reflect.GeneratedMethodAccessor374.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
        at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:83)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:40)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
        at $Proxy66.getFreeSpaceInKilobytes(Unknown Source)
        at sun.reflect.GeneratedMethodAccessor385.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:179)
        at ome.services.throttling.Callback.run(Callback.java:56)
        at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56)
        at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:132)
        at ome.services.blitz.impl.RepositoryInfoI.getFreeSpaceInKilobytes_async(RepositoryInfoI.java:42)
        at omero.api._IRepositoryInfoTie.getFreeSpaceInKilobytes_async(_IRepositoryInfoTie.java:64)
        at omero.api._IRepositoryInfoDisp.___getFreeSpaceInKilobytes(_IRepositoryInfoDisp.java:132)
        at omero.api._IRepositoryInfoDisp.__dispatch(_IRepositoryInfoDisp.java:218)
        at IceInternal.Incoming.invoke(Incoming.java:159)
        at Ice.ConnectionI.invokeAll(ConnectionI.java:2037)
        at Ice.ConnectionI.message(ConnectionI.java:972)
        at IceInternal.ThreadPool.run(ThreadPool.java:577)
        at IceInternal.ThreadPool.access$100(ThreadPool.java:12)
        at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:971)
    serverExceptionClass = ome.conditions.ResourceError
    message = Cannot run program "df": java.io.IOException: error=24, Too many open files
}

path:/webadmin/drivespace/, GET:, POST:, COOKIES:{'sessionid': '464ceee26b1d31fe5d0b4a78baee48b7'},
META:{'AUTH_TYPE': None, 'CONTENT_LENGTH': 0, 'CONTENT_TYPE': None, 'GATEWAY_INTERFACE': 'CGI/1.1',
 'HTTP_ACCEPT': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
 'HTTP_ACCEPT_CHARSET': 'ISO-8859-1,utf-8;q=0.7,*;q=0.7', 'HTTP_ACCEPT_ENCODING': 'gzip,deflate',
 'HTTP_ACCEPT_LANGUAGE': 'en-us,en;q=0.5', 'HTTP_CONNECTION': 'keep-alive',
 'HTTP_COOKIE': 'sessionid=464ceee26b1d31fe5d0b4a78baee48b7', 'HTTP_HOST': '141.52.175.71',
 'HTTP_KEEP_ALIVE': '115', 'HTTP_REFERER': 'http://141.52.175.71/webadmin/experimenters/',
 'HTTP_USER_AGENT': 'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8',
 'PATH_INFO': u'/webadmin/drivespace/', 'PATH_TRANSLATED': None, 'QUERY_STRING': None,
 'REMOTE_ADDR': '134.36.64.132', 'REMOTE_HOST': None, 'REMOTE_IDENT': None, 'REMOTE_USER': None,
 'REQUEST_METHOD': 'GET', 'SCRIPT_NAME': '', 'SERVER_NAME': '127.0.0.1', 'SERVER_PORT': 80,
 'SERVER_PROTOCOL': 'HTTP/1.1', 'SERVER_SOFTWARE': 'mod_python'}>
```
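The failing client-side call is visible in the gateway.py frame above: `rep_serv.getFreeSpaceInKilobytes() * 1024` is returned directly, so any server-side `ResourceError` propagates up and the Drive Space page renders as a 500. The following is a minimal, hypothetical Python sketch (not the actual OMERO.web code; the stub class and helper name are invented for illustration) of how that call could be guarded so the page degrades instead of erroring out:

```python
# Hypothetical sketch only -- FakeRepositoryService and get_free_space_bytes
# are illustration names, not part of OMERO.web.

class FakeRepositoryService(object):
    """Stand-in for the Ice IRepositoryInfo proxy, used only for this demo."""
    def getFreeSpaceInKilobytes(self):
        # Simulates the server-side failure reported in the traceback.
        raise RuntimeError('Cannot run program "df": error=24, Too many open files')

def get_free_space_bytes(rep_serv):
    """Return free space in bytes, or None if the server cannot report it."""
    try:
        return rep_serv.getFreeSpaceInKilobytes() * 1024  # KB -> bytes, as in gateway.py
    except Exception as e:  # the real code would catch the specific omero ResourceError
        print("Free space unavailable: %s" % e)
        return None

if __name__ == "__main__":
    # Prints the warning and None instead of surfacing a 500 to the browser.
    print(get_free_space_bytes(FakeRepositoryService()))
```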
comment:3 Changed 14 years ago by atarkowska
- Description modified (diff)
comment:4 Changed 14 years ago by atarkowska
- Description modified (diff)
- Owner changed from atarkowska to mlinkert-x
comment:5 Changed 14 years ago by jmoore
- Owner changed from mlinkert-x to cxallan
This is not a Bio-Formats issue per se, but rather an instance of the server running out of file handles. Chris recently responded to an email from Peter (http://lists.openmicroscopy.org.uk/pipermail/ome-devel/2010-September/001702.html), so I'm giving this to him. Once all the odds and ends are taken care of, this ticket can be closed.
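For context on the diagnosis: running `df` requires forking a child process and opening pipes, and both steps need fresh file descriptors, so they are the first thing to fail with errno 24 (EMFILE) once the process has exhausted its descriptor limit. The sketch below (Python and Unix-only, not the server's Java code, and not the fix applied here) shows the same measurement taken via a direct system call, which needs no subprocess, alongside the process's descriptor limit:

```python
# Illustration only: free-space via os.statvfs instead of spawning "df".
import os
import resource

def free_space_kb(path="/"):
    """Free space available to non-root users on the filesystem holding `path`, in KB."""
    st = os.statvfs(path)
    return (st.f_bavail * st.f_frsize) // 1024

if __name__ == "__main__":
    # The soft limit is the ceiling the leaking server process was hitting.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print("open-file limit (soft/hard): %d/%d" % (soft, hard))
    print("free space on /: %d KB" % free_space_kb("/"))
```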
comment:6 Changed 14 years ago by cxallan
- Resolution set to fixed
- Status changed from new to closed
This has been resolved by addressing the file handle leakage issues in #2764.
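The fix referenced in #2764 was made server-side; the snippet below is only a generic Python illustration of the leak-versus-no-leak pattern behind that kind of fix. A handle that is opened but never closed stays counted against the process's open-file limit, and once the limit is reached, unrelated operations such as spawning `df` start failing with "Too many open files"; releasing handles deterministically avoids that.

```python
# Generic illustration of file-handle hygiene; not the OMERO server code.
import tempfile

def leaky_read(path):
    # Descriptor is released only whenever the garbage collector gets to it.
    return open(path).read()

def tidy_read(path):
    # Context manager closes the descriptor as soon as the block exits.
    with open(path) as f:
        return f.read()

if __name__ == "__main__":
    with tempfile.NamedTemporaryFile(mode="w", delete=False) as tmp:
        tmp.write("demo")
    print(tidy_read(tmp.name))
```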